Dec 03 06:31:01 crc systemd[1]: Starting Kubernetes Kubelet...
Dec 03 06:31:01 crc restorecon[4742]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Dec 03 06:31:01 crc restorecon[4742]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 06:31:01 crc restorecon[4742]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc 
restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc 
restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 
06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc 
restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Dec 03 06:31:01 crc restorecon[4742]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:01 crc restorecon[4742]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:31:01 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc 
restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to
system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 
06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc 
restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc 
restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Dec 03 06:31:02 crc restorecon[4742]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 
crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc 
restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc 
restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc 
restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Dec 03 06:31:02 crc restorecon[4742]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Dec 03 06:31:02 crc kubenswrapper[4831]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:31:02 crc kubenswrapper[4831]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Dec 03 06:31:02 crc kubenswrapper[4831]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:31:02 crc kubenswrapper[4831]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 03 06:31:02 crc kubenswrapper[4831]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Dec 03 06:31:02 crc kubenswrapper[4831]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.801231 4831 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804108 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804123 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804128 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804132 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804135 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804139 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804143 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804146 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804150 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804154 
4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804158 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804162 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804167 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804171 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804175 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804180 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804184 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804188 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804193 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804198 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804203 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804218 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804223 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804227 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804231 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804235 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804239 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804244 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804247 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804251 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804254 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804258 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804261 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804265 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804268 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804272 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804275 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804280 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804283 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804287 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804290 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804295 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804300 4831 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804304 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804308 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804331 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804335 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804339 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804343 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804348 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804351 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804355 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804359 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804363 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804366 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804370 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804373 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804378 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804383 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804387 4831 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804391 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804395 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804398 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804402 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804405 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804409 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804412 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804418 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804422 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804426 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.804430 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804623 4831 flags.go:64] FLAG: --address="0.0.0.0"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804633 4831 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804640 4831 flags.go:64] FLAG: --anonymous-auth="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804646 4831 flags.go:64] FLAG: --application-metrics-count-limit="100"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804656 4831 flags.go:64] FLAG: --authentication-token-webhook="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804661 4831 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804667 4831 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804672 4831 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804677 4831 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804682 4831 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804687 4831 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804691 4831 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804696 4831 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804700 4831 flags.go:64] FLAG: --cgroup-root=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804704 4831 flags.go:64] FLAG: --cgroups-per-qos="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804708 4831 flags.go:64] FLAG: --client-ca-file=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804712 4831 flags.go:64] FLAG: --cloud-config=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804716 4831 flags.go:64] FLAG: --cloud-provider=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804720 4831 flags.go:64] FLAG: --cluster-dns="[]"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804727 4831 flags.go:64] FLAG: --cluster-domain=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804731 4831 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804737 4831 flags.go:64] FLAG: --config-dir=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804748 4831 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804753 4831 flags.go:64] FLAG: --container-log-max-files="5"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804758 4831 flags.go:64] FLAG: --container-log-max-size="10Mi"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804763 4831 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804767 4831 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804771 4831 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804775 4831 flags.go:64] FLAG: --contention-profiling="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804779 4831 flags.go:64] FLAG: --cpu-cfs-quota="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804783 4831 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804787 4831 flags.go:64] FLAG: --cpu-manager-policy="none"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804791 4831 flags.go:64] FLAG: --cpu-manager-policy-options=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804797 4831 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804801 4831 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804805 4831 flags.go:64] FLAG: --enable-debugging-handlers="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804809 4831 flags.go:64] FLAG: --enable-load-reader="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804814 4831 flags.go:64] FLAG: --enable-server="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804818 4831 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804823 4831 flags.go:64] FLAG: --event-burst="100"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804828 4831 flags.go:64] FLAG: --event-qps="50"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804832 4831 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804837 4831 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804840 4831 flags.go:64] FLAG: --eviction-hard=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804845 4831 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804849 4831 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804854 4831 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804858 4831 flags.go:64] FLAG: --eviction-soft=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804862 4831 flags.go:64] FLAG: --eviction-soft-grace-period=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804866 4831 flags.go:64] FLAG: --exit-on-lock-contention="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804871 4831 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804875 4831 flags.go:64] FLAG: --experimental-mounter-path=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804879 4831 flags.go:64] FLAG: --fail-cgroupv1="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804885 4831 flags.go:64] FLAG: --fail-swap-on="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804889 4831 flags.go:64] FLAG: --feature-gates=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804894 4831 flags.go:64] FLAG: --file-check-frequency="20s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804898 4831 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804903 4831 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804913 4831 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804917 4831 flags.go:64] FLAG: --healthz-port="10248"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804921 4831 flags.go:64] FLAG: --help="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804925 4831 flags.go:64] FLAG: --hostname-override=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804929 4831 flags.go:64] FLAG: --housekeeping-interval="10s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804933 4831 flags.go:64] FLAG: --http-check-frequency="20s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804937 4831 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804941 4831 flags.go:64] FLAG: --image-credential-provider-config=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804946 4831 flags.go:64] FLAG: --image-gc-high-threshold="85"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804950 4831 flags.go:64] FLAG: --image-gc-low-threshold="80"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804954 4831 flags.go:64] FLAG: --image-service-endpoint=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804958 4831 flags.go:64] FLAG: --kernel-memcg-notification="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804962 4831 flags.go:64] FLAG: --kube-api-burst="100"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804966 4831 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804971 4831 flags.go:64] FLAG: --kube-api-qps="50"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804975 4831 flags.go:64] FLAG: --kube-reserved=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804979 4831 flags.go:64] FLAG: --kube-reserved-cgroup=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804982 4831 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804986 4831 flags.go:64] FLAG: --kubelet-cgroups=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804990 4831 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.804997 4831 flags.go:64] FLAG: --lock-file=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805001 4831 flags.go:64] FLAG: --log-cadvisor-usage="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805005 4831 flags.go:64] FLAG: --log-flush-frequency="5s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805009 4831 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805015 4831 flags.go:64] FLAG: --log-json-split-stream="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805019 4831 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805023 4831 flags.go:64] FLAG: --log-text-split-stream="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805027 4831 flags.go:64] FLAG: --logging-format="text"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805031 4831 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805035 4831 flags.go:64] FLAG: --make-iptables-util-chains="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805039 4831 flags.go:64] FLAG: --manifest-url=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805043 4831 flags.go:64] FLAG: --manifest-url-header=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805049 4831 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805052 4831 flags.go:64] FLAG: --max-open-files="1000000"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805058 4831 flags.go:64] FLAG: --max-pods="110"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805062 4831 flags.go:64] FLAG: --maximum-dead-containers="-1"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805074 4831 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805078 4831 flags.go:64] FLAG: --memory-manager-policy="None"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805082 4831 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805086 4831 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805090 4831 flags.go:64] FLAG: --node-ip="192.168.126.11"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805095 4831 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805104 4831 flags.go:64] FLAG: --node-status-max-images="50"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805108 4831 flags.go:64] FLAG: --node-status-update-frequency="10s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805112 4831 flags.go:64] FLAG: --oom-score-adj="-999"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805118 4831 flags.go:64] FLAG: --pod-cidr=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805122 4831 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805131 4831 flags.go:64] FLAG: --pod-manifest-path=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805135 4831 flags.go:64] FLAG: --pod-max-pids="-1"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805139 4831 flags.go:64] FLAG: --pods-per-core="0"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805143 4831 flags.go:64] FLAG: --port="10250"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805147 4831 flags.go:64] FLAG: --protect-kernel-defaults="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805153 4831 flags.go:64] FLAG: --provider-id=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805156 4831 flags.go:64] FLAG: --qos-reserved=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805160 4831 flags.go:64] FLAG: --read-only-port="10255"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805165 4831 flags.go:64] FLAG: --register-node="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805169 4831 flags.go:64] FLAG: --register-schedulable="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805173 4831 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805180 4831 flags.go:64] FLAG: --registry-burst="10"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805184 4831 flags.go:64] FLAG: --registry-qps="5"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805188 4831 flags.go:64] FLAG: --reserved-cpus=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805192 4831 flags.go:64] FLAG: --reserved-memory=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805197 4831 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805201 4831 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805205 4831 flags.go:64] FLAG: --rotate-certificates="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805209 4831 flags.go:64] FLAG: --rotate-server-certificates="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805213 4831 flags.go:64] FLAG: --runonce="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805217 4831 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805221 4831 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805225 4831 flags.go:64] FLAG: --seccomp-default="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805229 4831 flags.go:64] FLAG: --serialize-image-pulls="true"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805233 4831 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805243 4831 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805248 4831 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805252 4831 flags.go:64] FLAG: --storage-driver-password="root"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805256 4831 flags.go:64] FLAG: --storage-driver-secure="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805260 4831 flags.go:64] FLAG: --storage-driver-table="stats"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805264 4831 flags.go:64] FLAG: --storage-driver-user="root"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805269 4831 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805273 4831 flags.go:64] FLAG: --sync-frequency="1m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805277 4831 flags.go:64] FLAG: --system-cgroups=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805280 4831 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805287 4831 flags.go:64] FLAG: --system-reserved-cgroup=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805290 4831 flags.go:64] FLAG: --tls-cert-file=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805295 4831 flags.go:64] FLAG: --tls-cipher-suites="[]"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805303 4831 flags.go:64] FLAG: --tls-min-version=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805307 4831 flags.go:64] FLAG: --tls-private-key-file=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805325 4831 flags.go:64] FLAG: --topology-manager-policy="none"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805329 4831 flags.go:64] FLAG: --topology-manager-policy-options=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805333 4831 flags.go:64] FLAG: --topology-manager-scope="container"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805338 4831 flags.go:64] FLAG: --v="2"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805344 4831 flags.go:64] FLAG: --version="false"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805356 4831 flags.go:64] FLAG: --vmodule=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805361 4831 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805366 4831 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805508 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805513 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805518 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805521 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805525 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805529 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805533 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805537 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805540 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805544 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805547 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805551 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805554 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805564 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805568 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805572 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805575 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805579 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805582 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805586 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805589 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805594 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805598 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805603 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805607 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805611 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805615 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805619 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805623 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805627 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805631 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805636 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805639 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805643 4831 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805646 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805650 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805653 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805657 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805660 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805664 4831 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805667 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805670 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805674 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805677 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805681 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805684 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805687 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805691 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805695 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805704 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805708 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805711 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805716 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805722 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805726 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805730 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805734 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805738 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805741 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805745 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805748 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805751 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805755 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805758 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805762 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805765 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805769 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805774 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805778 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805782 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.805786 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.805914 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.817962 4831 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.818004 4831 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818166 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818180 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818189 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818200 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818209 4831 feature_gate.go:330] unrecognized feature gate: Example
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818217 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818226 4831 feature_gate.go:330] unrecognized feature gate: NewOLM
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818234 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818243 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818255 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818271 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818281 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818290 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818301 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818343 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818358 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818370 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818380 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818389 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818398 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818407 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818415 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818427 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818437 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818446 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818455 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818467 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818477 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818487 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818495 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818504 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818513 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818521 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818529 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818538 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818546 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818556 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818564 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818573 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818581 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818589 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818597 4831 
feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818606 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818614 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818622 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818630 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818639 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818648 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818656 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818673 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818681 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818689 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818698 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818707 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818716 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818724 4831 feature_gate.go:330] unrecognized feature gate: 
AWSEFSDriverVolumeMetrics Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818733 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818741 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818750 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818758 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818766 4831 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818774 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818782 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818791 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818799 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818807 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818816 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818824 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818832 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818841 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.818849 
4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.818863 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819101 4831 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819115 4831 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819126 4831 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819135 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819145 4831 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819154 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819163 4831 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819171 4831 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819180 4831 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819189 4831 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Dec 03 
06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819197 4831 feature_gate.go:330] unrecognized feature gate: SignatureStores Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819205 4831 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819214 4831 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819222 4831 feature_gate.go:330] unrecognized feature gate: PlatformOperators Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819230 4831 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819239 4831 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819248 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819257 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819265 4831 feature_gate.go:330] unrecognized feature gate: PinnedImages Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819274 4831 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819282 4831 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819290 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819300 4831 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819308 4831 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 
06:31:02.819354 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819369 4831 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819381 4831 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819390 4831 feature_gate.go:330] unrecognized feature gate: InsightsConfig Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819400 4831 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819408 4831 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819417 4831 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819426 4831 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819434 4831 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819445 4831 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819457 4831 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819466 4831 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819476 4831 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819485 4831 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819494 4831 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819503 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819513 4831 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819521 4831 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819541 4831 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819551 4831 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819560 4831 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819568 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819577 4831 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819586 4831 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819595 4831 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819603 4831 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819612 4831 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819620 4831 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819628 4831 feature_gate.go:330] unrecognized feature gate: Example Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819637 4831 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819647 4831 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819656 4831 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819665 4831 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819673 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819682 4831 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819690 4831 feature_gate.go:330] unrecognized feature gate: OVNObservability Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819698 4831 feature_gate.go:330] unrecognized feature gate: NewOLM Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819706 4831 feature_gate.go:330] unrecognized feature gate: GatewayAPI Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819714 4831 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819723 4831 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819731 4831 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819740 4831 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819749 4831 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819757 4831 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819766 4831 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819774 4831 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.819782 4831 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.819796 4831 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.820751 4831 server.go:940] "Client rotation is on, will bootstrap in background" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.826107 4831 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.826251 4831 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.827092 4831 server.go:997] "Starting client certificate rotation" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.827139 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.827407 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-12-07 23:39:33.459921542 +0000 UTC Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.827570 4831 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 113h8m30.632356674s for next certificate rotation Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.846224 4831 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.849008 4831 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.866011 4831 log.go:25] "Validated CRI v1 runtime API" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.887093 4831 log.go:25] "Validated CRI v1 image API" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.889489 4831 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.892916 4831 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2025-12-03-06-26-35-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.892963 4831 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 
blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.917953 4831 manager.go:217] Machine: {Timestamp:2025-12-03 06:31:02.916017169 +0000 UTC m=+0.259600757 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:1b42c798-2812-40ef-a506-f181e54d7ef9 BootID:0b87e1b8-395c-4ff9-834e-79e149dbf129 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:ff:4e:04 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:ff:4e:04 Speed:-1 
Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:c9:1d:27 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:43:4c:23 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:9d:23:04 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:27:79:f8 Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:de:af:87 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4a:ba:eb:c7:64:a9 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:3e:ea:76:d9:88:46 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] 
SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.918320 4831 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.918588 4831 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.918991 4831 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.919256 4831 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.919342 4831 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],
"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.919690 4831 topology_manager.go:138] "Creating topology manager with none policy" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.919709 4831 container_manager_linux.go:303] "Creating device plugin manager" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.920039 4831 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.920088 4831 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.920349 4831 state_mem.go:36] "Initialized new in-memory state store" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.920470 4831 server.go:1245] "Using root directory" path="/var/lib/kubelet" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.921607 4831 kubelet.go:418] "Attempting to sync node with API server" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.921658 4831 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.921712 4831 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.921731 4831 kubelet.go:324] "Adding apiserver pod source" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.921751 4831 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 
06:31:02.933186 4831 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.933453 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.933518 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:02 crc kubenswrapper[4831]: E1203 06:31:02.933615 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:31:02 crc kubenswrapper[4831]: E1203 06:31:02.933643 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.933862 4831 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.935256 4831 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936293 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936381 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936403 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936423 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936454 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936471 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936488 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936515 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936535 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936554 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936579 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936597 4831 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.936928 4831 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.937854 4831 server.go:1280] "Started kubelet" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.938084 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.939718 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Dec 03 06:31:02 crc systemd[1]: Started Kubernetes Kubelet. Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.940110 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 22:56:06.445362279 +0000 UTC Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.940152 4831 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 856h25m3.505212296s for next certificate rotation Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.945637 4831 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.945651 4831 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.946111 4831 volume_manager.go:287] "The desired_state_of_world populator starts" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.946133 4831 volume_manager.go:289] "Starting Kubelet Volume Manager" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.946261 4831 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.948933 4831 server.go:460] "Adding debug handlers to kubelet server" Dec 03 06:31:02 crc kubenswrapper[4831]: E1203 06:31:02.948671 4831 event.go:368] "Unable to write event (may retry 
after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187da0d45c195b6e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:31:02.937779054 +0000 UTC m=+0.281362602,LastTimestamp:2025-12-03 06:31:02.937779054 +0000 UTC m=+0.281362602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:31:02 crc kubenswrapper[4831]: E1203 06:31:02.952816 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.954098 4831 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 03 06:31:02 crc kubenswrapper[4831]: E1203 06:31:02.954573 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="200ms" Dec 03 06:31:02 crc kubenswrapper[4831]: W1203 06:31:02.954769 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:02 crc kubenswrapper[4831]: E1203 06:31:02.954883 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get 
\"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.954915 4831 factory.go:55] Registering systemd factory Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.954960 4831 factory.go:221] Registration of the systemd container factory successfully Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.955252 4831 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.955359 4831 factory.go:153] Registering CRI-O factory Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.955381 4831 factory.go:221] Registration of the crio container factory successfully Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.955454 4831 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.955492 4831 factory.go:103] Registering Raw factory Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.955516 4831 manager.go:1196] Started watching for new ooms in manager Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.958962 4831 manager.go:319] Starting recovery of all containers Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.960749 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.960961 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.961082 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.961196 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.961515 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.961630 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.961769 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.961879 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.961994 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.962118 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.962285 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.962425 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.962537 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.962672 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.962801 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.962913 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.963022 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.963129 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.963238 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.963400 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" 
seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.963580 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.963697 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.963808 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.963915 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.964046 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.964174 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 
06:31:02.964292 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.964442 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.964555 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.964665 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.964804 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.964917 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.965028 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.965135 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.965255 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.965462 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.965595 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.965831 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.965963 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.966095 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.966210 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.966367 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.966496 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.966611 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.966728 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" 
seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.966845 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.967037 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.967179 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.967296 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.967451 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.967570 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 
06:31:02.967707 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.967836 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.967964 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.968095 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.968225 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.968371 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.968508 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.968629 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.968767 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.968907 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.969026 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.969162 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.969288 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.969467 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.969584 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.969695 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.969842 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.969959 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.970072 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.970195 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.970309 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.970460 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.970596 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.972020 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.972204 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.972399 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.972549 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.972739 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.972875 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.973009 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.973126 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.973240 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.973432 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.973560 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.973673 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.973786 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.973912 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.974051 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.976009 4831 manager.go:324] Recovery completed Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.977814 4831 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.977898 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.977929 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.977953 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 
06:31:02.977976 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.977998 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978018 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978042 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978065 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978087 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978126 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978148 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978168 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978187 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978208 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978228 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978258 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978284 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978306 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978369 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978402 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978440 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978464 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" 
seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978487 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978509 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978530 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978552 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978574 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978593 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 
06:31:02.978646 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978668 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978688 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978709 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978730 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978750 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978771 4831 reconstruct.go:130] "Volume is marked as uncertain 
and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978790 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978810 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978831 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978855 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978876 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978897 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978918 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978938 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978957 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978978 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.978998 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979018 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979040 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979062 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979081 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979100 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979120 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979142 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979161 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979182 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979201 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979222 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979243 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979265 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" 
volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979285 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979308 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979372 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979396 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979417 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979438 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" 
volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979501 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979527 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979549 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979573 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979597 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979620 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" 
volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979640 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979662 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979685 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979705 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979726 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979748 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979769 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979790 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979810 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979831 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979853 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979873 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979894 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979917 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979940 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979961 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.979982 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980003 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980022 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980042 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980063 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980084 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980107 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980128 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980148 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980170 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980201 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980222 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980243 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980264 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980283 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980303 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980368 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980394 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980414 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980434 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980455 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980475 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980495 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980515 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980535 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980555 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980576 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980597 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980616 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980638 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980659 4831 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext=""
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980678 4831 reconstruct.go:97] "Volume reconstruction finished"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.980692 4831 reconciler.go:26] "Reconciler: start to sync state"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.990151 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.992053 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.992107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.992123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.993424 4831 cpu_manager.go:225] "Starting CPU manager" policy="none"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.993441 4831 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Dec 03 06:31:02 crc kubenswrapper[4831]: I1203 06:31:02.993463 4831 state_mem.go:36] "Initialized new in-memory state store"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.004256 4831 policy_none.go:49] "None policy: Start"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.006367 4831 memory_manager.go:170] "Starting memorymanager" policy="None"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.006474 4831 state_mem.go:35] "Initializing new in-memory state store"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.008462 4831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.011347 4831 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.011396 4831 status_manager.go:217] "Starting to sync pod status with apiserver"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.011440 4831 kubelet.go:2335] "Starting kubelet main sync loop"
Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.011514 4831 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Dec 03 06:31:03 crc kubenswrapper[4831]: W1203 06:31:03.013514 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused
Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.013614 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError"
Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.053161 4831 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.079090 4831 manager.go:334] "Starting Device Plugin manager"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.079164 4831 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.079184 4831 server.go:79] "Starting device plugin registration server"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.079850 4831 eviction_manager.go:189] "Eviction manager: starting control loop"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.079885 4831 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.080459 4831 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.080626 4831 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.080647 4831 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.088795 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.112030 4831 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"]
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.112131 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.113172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.113201 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.113213 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.113374 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.113931 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.113965 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.114757 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.114778 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.114789 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.115754 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.115776 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.115787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.115897 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.116340 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.116371 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.118491 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.118514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.118525 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.118658 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.118676 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.118687 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.118783 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.119209 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.119233 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.120519 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.120535 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.120542 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.120541 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.120589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.120615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.120845 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.121042 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.121092 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.122170 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.122186 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.122194 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.122247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.122275 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.122292 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.122328 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.122347 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.124418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.124506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.124532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.155854 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="400ms"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.180367 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.181455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.181518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.181539 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.181577 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.182214 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182328 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182372 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182397 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182421 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182442 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182461 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182655 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182719 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182756 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182787 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182821 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182851 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182876 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.182918 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283672 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283696 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283714 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283731 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283747 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283763 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283779 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283796 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283812 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283827 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283840 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283855 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283873 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283865 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283896 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc"
Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283946 4831
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283889 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283928 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283883 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" 
(UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.284037 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.284012 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283992 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.284057 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.283933 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.284098 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.284121 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.284232 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.382883 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.384483 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.384570 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.384596 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.384640 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.385367 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 
38.102.83.234:6443: connect: connection refused" node="crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.459218 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.474192 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: W1203 06:31:03.488670 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-b9c6614ca570e9fc35d76158713e36a834378359af6d50fe70143210a247a57e WatchSource:0}: Error finding container b9c6614ca570e9fc35d76158713e36a834378359af6d50fe70143210a247a57e: Status 404 returned error can't find the container with id b9c6614ca570e9fc35d76158713e36a834378359af6d50fe70143210a247a57e Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.494964 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: W1203 06:31:03.498375 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-91dbe0bc82aa8921690a3df468dec2d23f3bd1c9ddccbafa002dfbe8b56970c9 WatchSource:0}: Error finding container 91dbe0bc82aa8921690a3df468dec2d23f3bd1c9ddccbafa002dfbe8b56970c9: Status 404 returned error can't find the container with id 91dbe0bc82aa8921690a3df468dec2d23f3bd1c9ddccbafa002dfbe8b56970c9 Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.503588 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: W1203 06:31:03.508747 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ae3ec6719dcfec896825bfbed739f0948f0449049b944634f7ba67b4e2f87a7c WatchSource:0}: Error finding container ae3ec6719dcfec896825bfbed739f0948f0449049b944634f7ba67b4e2f87a7c: Status 404 returned error can't find the container with id ae3ec6719dcfec896825bfbed739f0948f0449049b944634f7ba67b4e2f87a7c Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.512900 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:31:03 crc kubenswrapper[4831]: W1203 06:31:03.530081 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-87dcaba7ae8ba1aa76e767e6aeb8920ba5ec28aed235aa410b856ead1cc670a2 WatchSource:0}: Error finding container 87dcaba7ae8ba1aa76e767e6aeb8920ba5ec28aed235aa410b856ead1cc670a2: Status 404 returned error can't find the container with id 87dcaba7ae8ba1aa76e767e6aeb8920ba5ec28aed235aa410b856ead1cc670a2 Dec 03 06:31:03 crc kubenswrapper[4831]: W1203 06:31:03.550745 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-91348b4faa17f421a41de3f0dc3ef330c0e2d9f2779d191fda660077625bb032 WatchSource:0}: Error finding container 91348b4faa17f421a41de3f0dc3ef330c0e2d9f2779d191fda660077625bb032: Status 404 returned error can't find the container with id 91348b4faa17f421a41de3f0dc3ef330c0e2d9f2779d191fda660077625bb032 Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.557191 4831 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="800ms" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.785645 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.787454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.787505 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.787516 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.787545 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.788076 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection refused" node="crc" Dec 03 06:31:03 crc kubenswrapper[4831]: W1203 06:31:03.859768 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.859917 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: 
connect: connection refused" logger="UnhandledError" Dec 03 06:31:03 crc kubenswrapper[4831]: I1203 06:31:03.939346 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:03 crc kubenswrapper[4831]: W1203 06:31:03.963476 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:03 crc kubenswrapper[4831]: E1203 06:31:03.963573 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.018148 4831 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="50223f94ad8bd7d3e4a1fd96c6525e2344cce1c7d8172dfb849ff4177d29d35c" exitCode=0 Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.018247 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"50223f94ad8bd7d3e4a1fd96c6525e2344cce1c7d8172dfb849ff4177d29d35c"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.018389 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"b9c6614ca570e9fc35d76158713e36a834378359af6d50fe70143210a247a57e"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.020105 4831 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f" exitCode=0 Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.020220 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.020270 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"91348b4faa17f421a41de3f0dc3ef330c0e2d9f2779d191fda660077625bb032"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.020474 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.021537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.021573 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.021589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.022989 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.023025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"87dcaba7ae8ba1aa76e767e6aeb8920ba5ec28aed235aa410b856ead1cc670a2"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.029769 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a" exitCode=0 Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.029838 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.029862 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae3ec6719dcfec896825bfbed739f0948f0449049b944634f7ba67b4e2f87a7c"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.029993 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.030984 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.031025 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.031039 4831 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.031362 4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="070e7b8e81e5b07a10339a1bb0cdb5d04f8055e2888f97b5c6c338c7fbcb50ea" exitCode=0 Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.031389 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"070e7b8e81e5b07a10339a1bb0cdb5d04f8055e2888f97b5c6c338c7fbcb50ea"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.031427 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"91dbe0bc82aa8921690a3df468dec2d23f3bd1c9ddccbafa002dfbe8b56970c9"} Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.031525 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.032255 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.032282 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.032297 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.033637 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.034400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.034436 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.034455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:04 crc kubenswrapper[4831]: E1203 06:31:04.357925 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="1.6s" Dec 03 06:31:04 crc kubenswrapper[4831]: E1203 06:31:04.533567 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.187da0d45c195b6e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:31:02.937779054 +0000 UTC m=+0.281362602,LastTimestamp:2025-12-03 06:31:02.937779054 +0000 UTC m=+0.281362602,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:31:04 crc kubenswrapper[4831]: W1203 06:31:04.547611 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:04 crc kubenswrapper[4831]: E1203 06:31:04.547695 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list 
*v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:31:04 crc kubenswrapper[4831]: W1203 06:31:04.570639 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:04 crc kubenswrapper[4831]: E1203 06:31:04.570751 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.234:6443: connect: connection refused" logger="UnhandledError" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.589193 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.590878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.590912 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.590921 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.590943 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:31:04 crc kubenswrapper[4831]: E1203 06:31:04.591381 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.234:6443: connect: connection 
refused" node="crc" Dec 03 06:31:04 crc kubenswrapper[4831]: I1203 06:31:04.940002 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.234:6443: connect: connection refused Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.035938 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371"} Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.035992 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1"} Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.036010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d"} Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.036116 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.036811 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.036841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.036852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:05 
crc kubenswrapper[4831]: I1203 06:31:05.039036 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558"} Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.039061 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc"} Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.039069 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.039071 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26"} Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.039797 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.039827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.039837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.041793 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c"} Dec 03 06:31:05 
crc kubenswrapper[4831]: I1203 06:31:05.041848 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1"}
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.041863 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f"}
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.041875 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801"}
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.043514 4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="8eb6c246a59310db5140a5cd3b4e7b26de51a98a5ebc61347c0bce1468f1cb4d" exitCode=0
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.043553 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"8eb6c246a59310db5140a5cd3b4e7b26de51a98a5ebc61347c0bce1468f1cb4d"}
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.043586 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.043718 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.044223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.044259 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.044273 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.044550 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.044582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:05 crc kubenswrapper[4831]: I1203 06:31:05.044593 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.050256 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.050249 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812"}
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.051034 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.051059 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.051067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.053775 4831 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="afb059fb5112e79cb880abafe8e617793e1db3d14b20ef719b4478d11fce1b9c" exitCode=0
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.053816 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"afb059fb5112e79cb880abafe8e617793e1db3d14b20ef719b4478d11fce1b9c"}
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.054009 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.054980 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.055048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.055063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.055279 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"ca2e6c838b6a2917c53bec08f8abd4dc5e5cf0e279e7352a225ddd7703853343"}
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.055296 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.055339 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.055386 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.055396 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056437 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056462 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056490 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056510 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056496 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056596 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.056608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.192497 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.195105 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.195168 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.195192 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:06 crc kubenswrapper[4831]: I1203 06:31:06.195241 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.007567 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.048751 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.060772 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e7cbfdbc110615fc390bf72addedafd63442e8bd4ae74101aa566c26cd031960"}
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.060807 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e2343ed445f3c7d70a6e7600a7e1f6d368777fdb049346ceffc08988244f05ab"}
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.060822 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"3437ebcb88c9e7ef0889abcb7474a049b1e214531c0c2392c15ad2ccac4a1afe"}
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.060833 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6a4e0413fa3ddf8b99e7f2cc90969d8badba86d572645d9fc6bcabcf0b43acc8"}
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.060956 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.061821 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.061861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.061870 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.450528 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.913357 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.913635 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.915261 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.915354 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:07 crc kubenswrapper[4831]: I1203 06:31:07.915374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.070916 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"083e245dd971b9daa1af7905a76311342a3da90d39e3582f18bd1113b64945e7"}
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.071011 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.071150 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.072395 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.072435 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.072451 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.072793 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.072858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.072883 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.593804 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.594077 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.595879 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.595923 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:08 crc kubenswrapper[4831]: I1203 06:31:08.595942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.073804 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.073895 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.075465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.075505 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.075517 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.075560 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.075598 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.075615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.203570 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.203767 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.205084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.205162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.205199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:09 crc kubenswrapper[4831]: I1203 06:31:09.400199 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.076209 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.077737 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.077848 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.077959 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.126662 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.438585 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.439239 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.442066 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.442128 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.442145 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.449226 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.914463 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 06:31:10 crc kubenswrapper[4831]: I1203 06:31:10.914881 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.080084 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.081220 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.081611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.081672 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.081696 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.082876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.082923 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.082946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.990082 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.990675 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.992341 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.992389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:11 crc kubenswrapper[4831]: I1203 06:31:11.992402 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:13 crc kubenswrapper[4831]: E1203 06:31:13.089053 4831 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Dec 03 06:31:15 crc kubenswrapper[4831]: W1203 06:31:15.940810 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 03 06:31:15 crc kubenswrapper[4831]: I1203 06:31:15.940933 4831 trace.go:236] Trace[2132235379]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:31:05.939) (total time: 10001ms):
Dec 03 06:31:15 crc kubenswrapper[4831]: Trace[2132235379]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:31:15.940)
Dec 03 06:31:15 crc kubenswrapper[4831]: Trace[2132235379]: [10.00182336s] [10.00182336s] END
Dec 03 06:31:15 crc kubenswrapper[4831]: I1203 06:31:15.940865 4831 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout
Dec 03 06:31:15 crc kubenswrapper[4831]: E1203 06:31:15.941519 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 03 06:31:15 crc kubenswrapper[4831]: E1203 06:31:15.959502 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" interval="3.2s"
Dec 03 06:31:16 crc kubenswrapper[4831]: E1203 06:31:16.196907 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": net/http: TLS handshake timeout" node="crc"
Dec 03 06:31:16 crc kubenswrapper[4831]: W1203 06:31:16.226129 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 03 06:31:16 crc kubenswrapper[4831]: I1203 06:31:16.226265 4831 trace.go:236] Trace[448484936]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:31:06.224) (total time: 10001ms):
Dec 03 06:31:16 crc kubenswrapper[4831]: Trace[448484936]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (06:31:16.226)
Dec 03 06:31:16 crc kubenswrapper[4831]: Trace[448484936]: [10.00173933s] [10.00173933s] END
Dec 03 06:31:16 crc kubenswrapper[4831]: E1203 06:31:16.226311 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 03 06:31:16 crc kubenswrapper[4831]: W1203 06:31:16.308737 4831 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout
Dec 03 06:31:16 crc kubenswrapper[4831]: I1203 06:31:16.308887 4831 trace.go:236] Trace[843352170]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:31:06.304) (total time: 10004ms):
Dec 03 06:31:16 crc kubenswrapper[4831]: Trace[843352170]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10003ms (06:31:16.308)
Dec 03 06:31:16 crc kubenswrapper[4831]: Trace[843352170]: [10.004627104s] [10.004627104s] END
Dec 03 06:31:16 crc kubenswrapper[4831]: E1203 06:31:16.308930 4831 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError"
Dec 03 06:31:16 crc kubenswrapper[4831]: I1203 06:31:16.315852 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 03 06:31:16 crc kubenswrapper[4831]: I1203 06:31:16.315992 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 03 06:31:16 crc kubenswrapper[4831]: I1203 06:31:16.323387 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403}
Dec 03 06:31:16 crc kubenswrapper[4831]: I1203 06:31:16.323466 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403"
Dec 03 06:31:17 crc kubenswrapper[4831]: I1203 06:31:17.455632 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]log ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]etcd ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-apiserver-admission-initializer ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/openshift.io-api-request-count-filter ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/openshift.io-startkubeinformers ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/generic-apiserver-start-informers ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/priority-and-fairness-config-consumer ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/priority-and-fairness-filter ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/storage-object-count-tracker-hook ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-apiextensions-informers ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-apiextensions-controllers ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/crd-informer-synced ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-system-namespaces-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-cluster-authentication-info-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-legacy-token-tracking-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-service-ip-repair-controllers ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/priority-and-fairness-config-producer ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/bootstrap-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/aggregator-reload-proxy-client-cert ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/start-kube-aggregator-informers ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/apiservice-status-local-available-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/apiservice-status-remote-available-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/apiservice-registration-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/apiservice-wait-for-first-sync ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/apiservice-discovery-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/kube-apiserver-autoregistration ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]autoregister-completion ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/apiservice-openapi-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: [+]poststarthook/apiservice-openapiv3-controller ok
Dec 03 06:31:17 crc kubenswrapper[4831]: livez check failed
Dec 03 06:31:17 crc kubenswrapper[4831]: I1203 06:31:17.455731 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Dec 03 06:31:18 crc kubenswrapper[4831]: I1203 06:31:18.604228 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Dec 03 06:31:18 crc kubenswrapper[4831]: I1203 06:31:18.604444 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:18 crc kubenswrapper[4831]: I1203 06:31:18.605982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:18 crc kubenswrapper[4831]: I1203 06:31:18.606168 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:18 crc kubenswrapper[4831]: I1203 06:31:18.606361 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:19 crc kubenswrapper[4831]: I1203 06:31:19.328106 4831 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Dec 03 06:31:19 crc kubenswrapper[4831]: I1203 06:31:19.398231 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:19 crc kubenswrapper[4831]: I1203 06:31:19.400187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:19 crc kubenswrapper[4831]: I1203 06:31:19.400275 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:19 crc kubenswrapper[4831]: I1203 06:31:19.400305 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:19 crc kubenswrapper[4831]: I1203 06:31:19.400378 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Dec 03 06:31:19 crc kubenswrapper[4831]: E1203 06:31:19.405840 4831 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Dec 03 06:31:20 crc kubenswrapper[4831]: I1203 06:31:20.157641 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc"
Dec 03 06:31:20 crc kubenswrapper[4831]: I1203 06:31:20.157908 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:20 crc kubenswrapper[4831]: I1203 06:31:20.160779 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:20 crc kubenswrapper[4831]: I1203 06:31:20.160817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:20 crc kubenswrapper[4831]: I1203 06:31:20.160828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:20 crc kubenswrapper[4831]: I1203 06:31:20.177014 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc"
Dec 03 06:31:20 crc kubenswrapper[4831]: I1203 06:31:20.915171 4831 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Dec 03 06:31:20 crc kubenswrapper[4831]: I1203 06:31:20.915271 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.108755 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.109675 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.109784 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.109858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.320433 4831 trace.go:236] Trace[1684318561]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (03-Dec-2025 06:31:06.782) (total time: 14537ms):
Dec 03 06:31:21 crc kubenswrapper[4831]: Trace[1684318561]: ---"Objects listed" error: 14537ms (06:31:21.320)
Dec 03 06:31:21 crc kubenswrapper[4831]: Trace[1684318561]: [14.537999478s] [14.537999478s] END
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.320652 4831 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.321379 4831 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.354802 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57160->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.354883 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57160->192.168.126.11:17697: read: connection reset by peer"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.354816 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57168->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.355111 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:57168->192.168.126.11:17697: read: connection reset by peer"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.521850 4831 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.606216 4831 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.947863 4831 apiserver.go:52] "Watching apiserver"
Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.959184 4831 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Dec 03 06:31:21 crc kubenswrapper[4831]:
I1203 06:31:21.959819 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.960448 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.960561 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:21 crc kubenswrapper[4831]: E1203 06:31:21.960745 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.960790 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:21 crc kubenswrapper[4831]: E1203 06:31:21.960915 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.961290 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.962131 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.962348 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.962980 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.963080 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 06:31:21 crc kubenswrapper[4831]: E1203 06:31:21.963084 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.963189 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.965094 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.965217 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.965622 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.965895 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.966093 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 06:31:21 crc kubenswrapper[4831]: I1203 06:31:21.966280 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.023855 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.042739 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.047175 4831 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.057634 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.071040 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.079789 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.097021 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.107638 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.113860 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.116206 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812" exitCode=255 Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.116279 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812"} Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.120339 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126528 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126589 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126621 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod 
\"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126647 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126669 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126690 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126711 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126731 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126751 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126792 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126812 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126831 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126850 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126910 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" 
(UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126932 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126954 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126973 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.126997 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127018 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127039 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127061 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127082 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127103 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127123 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127129 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127165 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127265 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127305 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127373 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127404 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127433 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127457 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127479 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127503 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127528 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127559 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127589 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127627 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127656 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127683 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127709 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127735 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127767 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127795 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127821 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127841 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127850 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127929 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.127976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128020 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128060 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128094 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128125 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128153 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128161 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128187 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128205 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128222 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128262 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128301 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 
06:31:22.128367 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128402 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128438 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128473 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128506 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128544 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: 
\"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128580 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128612 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128648 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128681 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128723 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128754 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128824 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128858 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128951 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128984 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129018 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129091 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129122 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129158 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " 
Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129190 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129229 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129262 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129304 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129362 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129395 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129427 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129472 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129505 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129541 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129574 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129604 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129634 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129667 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129698 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129735 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129769 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129890 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129945 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130042 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:31:22 crc 
kubenswrapper[4831]: I1203 06:31:22.130075 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130108 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130139 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130171 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130202 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130237 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: 
\"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130268 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130300 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130355 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130390 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130425 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: 
I1203 06:31:22.130455 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130494 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130528 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130559 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130592 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130627 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod 
\"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130661 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130693 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130726 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130762 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130826 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130859 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130891 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130924 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130957 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131030 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131068 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: 
\"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131106 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131140 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131176 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131212 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131248 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131281 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131512 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131591 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131624 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131774 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131814 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") 
" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131852 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131885 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131920 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132021 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132061 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132101 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: 
\"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132137 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132172 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132206 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132244 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132282 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Dec 03 06:31:22 crc 
kubenswrapper[4831]: I1203 06:31:22.132341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132380 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132414 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132448 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132485 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132517 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132550 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132583 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132618 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132651 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132689 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: 
I1203 06:31:22.132722 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132757 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132789 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132825 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132864 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132895 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132928 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132964 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132996 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133068 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133100 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133137 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133174 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133212 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133250 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133283 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" 
(UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133342 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133392 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133429 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133464 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133500 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133534 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: 
\"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133568 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133604 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133641 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133672 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133707 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " 
Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133746 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133779 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133812 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133861 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133897 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133935 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" 
(UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133970 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134038 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134127 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134166 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134205 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134246 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134403 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc 
kubenswrapper[4831]: I1203 06:31:22.134440 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134474 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134517 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134626 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134710 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134734 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134759 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134810 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134829 4831 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128266 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128493 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128580 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.137097 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.137104 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128638 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128821 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.137178 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128923 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.128976 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129103 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.129199 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130097 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.137387 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130383 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.137373 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130446 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130739 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130812 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130970 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131059 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131307 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131599 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131631 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131646 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131660 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131636 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131773 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.131833 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132017 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132134 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.137579 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132566 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132563 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132784 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132923 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132981 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.137655 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.132996 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133010 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). 
InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133187 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133197 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133388 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133534 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133900 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.133919 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134197 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134199 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134485 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.134690 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.136280 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.136264 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.136607 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.136689 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.136712 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.136924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.136921 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.130715 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.137939 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138133 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138165 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138154 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138338 4831 scope.go:117] "RemoveContainer" containerID="e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138350 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138382 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138540 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138561 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.138924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.139103 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.139121 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.139181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.143255 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.144672 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.145457 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.145828 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:31:22.64580691 +0000 UTC m=+19.989390418 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.146489 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.147165 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.147641 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.147678 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.147744 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.147901 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.152854 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.152886 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.153149 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.153220 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.153407 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.153434 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.153598 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.153657 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.154430 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.154817 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.155057 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.155166 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.155425 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.155857 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.156443 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.156549 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.156592 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.156793 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.156955 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.157227 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.157337 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.157865 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.158268 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.158268 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.158491 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.158571 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.158979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.159164 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.159298 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.159448 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.159465 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.159478 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.159594 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.159912 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.160188 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.160241 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.160398 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.160449 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.160606 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.160601 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.160665 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.160883 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.161046 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.161246 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). 
InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.161437 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:22.661257596 +0000 UTC m=+20.004841144 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.161716 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162026 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162161 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162242 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162404 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162300 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162472 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.161524 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162562 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162881 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.162910 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.163377 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.164191 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.164262 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.164498 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.164505 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.164620 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.164697 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.165187 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.165072 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.165752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.166368 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.166430 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.166574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.166650 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.166918 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.166947 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.165971 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.167182 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.167203 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.167218 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.167241 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.167632 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.167924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.168439 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.168450 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.168685 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.168722 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.168058 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.169003 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.169230 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.169308 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.169299 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.169440 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.169636 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.161156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.170178 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.170423 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.170525 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.170991 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.171009 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:22.670595398 +0000 UTC m=+20.014178916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.171962 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.173240 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.173524 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.173531 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.173580 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.175877 4831 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.176042 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.177708 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.178664 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.182852 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.186912 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.186938 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.187062 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.186969 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.187365 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.187623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.190460 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.190483 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.190594 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.191172 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.191193 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.191207 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.191202 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.191256 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:22.691240187 +0000 UTC m=+20.034823695 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.193438 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.193484 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.193504 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.193591 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:22.69356703 +0000 UTC m=+20.037150638 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.196242 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.197278 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.198471 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.198845 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.207434 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.209084 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.212771 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.219260 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.220953 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.230711 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.234617 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235212 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235362 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235379 4831 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235393 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235404 4831 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235414 4831 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235425 4831 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235437 4831 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235448 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235460 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235472 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235483 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235494 4831 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235505 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235515 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235528 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235539 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235549 4831 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235561 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235572 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node 
\"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235582 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235594 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235605 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235616 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235627 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235638 4831 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235648 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235658 4831 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235669 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235680 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235691 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235701 4831 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235712 4831 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235723 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235733 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235744 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235754 4831 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235764 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235774 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235785 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235795 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235807 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" 
Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235819 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235831 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235842 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235852 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235863 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235876 4831 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235886 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235896 4831 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235906 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235917 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235927 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235940 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235951 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235963 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235973 4831 reconciler_common.go:293] "Volume detached for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235983 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.235996 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236009 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236021 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236044 4831 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236056 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236068 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" 
(UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236078 4831 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236088 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236099 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236110 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236120 4831 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236130 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236140 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc 
kubenswrapper[4831]: I1203 06:31:22.236151 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236162 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236172 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236184 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236194 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236205 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236215 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236226 4831 reconciler_common.go:293] "Volume detached for volume 
\"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236237 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236247 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236258 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236269 4831 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236280 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236291 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236302 4831 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on 
node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236333 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236346 4831 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236358 4831 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236370 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236381 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236391 4831 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236402 4831 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 
06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236413 4831 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236424 4831 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236433 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236445 4831 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236456 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236467 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236477 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236487 4831 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236500 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236511 4831 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236521 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236531 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236542 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236554 4831 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236565 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236574 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236584 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236595 4831 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236605 4831 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236615 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236626 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236637 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc 
kubenswrapper[4831]: I1203 06:31:22.236647 4831 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236657 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236667 4831 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236679 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236689 4831 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236699 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236709 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236719 4831 
reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236730 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236740 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236751 4831 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236762 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236773 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236783 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236793 4831 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236804 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236814 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236825 4831 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236835 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236843 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236851 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236859 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 
06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236867 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236875 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236905 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236921 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236934 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236944 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236952 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236960 4831 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236968 4831 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236976 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236987 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.236999 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237010 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237020 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237031 4831 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237039 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237048 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237056 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237063 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237073 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237085 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237095 4831 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237105 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237118 4831 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237129 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237139 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237147 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237155 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237166 4831 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node 
\"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237176 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237187 4831 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237198 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237210 4831 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237222 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237232 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237244 4831 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc 
kubenswrapper[4831]: I1203 06:31:22.237254 4831 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237266 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237277 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237287 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237302 4831 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237816 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237840 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237851 4831 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.237864 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.238296 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.238308 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.238332 4831 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.238342 4831 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.238353 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.238363 4831 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.238433 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.238537 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.246487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.255208 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.288434 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.301187 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.313896 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Dec 03 06:31:22 crc kubenswrapper[4831]: W1203 06:31:22.331324 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-0988a3537e76d710913c8a803207f3759022667594e640bc3a337e0017176be2 WatchSource:0}: Error finding container 0988a3537e76d710913c8a803207f3759022667594e640bc3a337e0017176be2: Status 404 returned error can't find the container with id 0988a3537e76d710913c8a803207f3759022667594e640bc3a337e0017176be2 Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.339026 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.339269 4831 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.455818 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.467012 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.482307 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.490450 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready 
status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.502785 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.513839 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.531219 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.541008 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.742436 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.742536 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.742648 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.742713 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:22 crc kubenswrapper[4831]: I1203 06:31:22.742759 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.742859 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.742931 4831 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:23.742909754 +0000 UTC m=+21.086493292 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743055 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:31:23.743030218 +0000 UTC m=+21.086613766 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743202 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743233 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743257 4831 projected.go:194] 
Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743355 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:23.743300376 +0000 UTC m=+21.086883924 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743462 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743506 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:23.743492682 +0000 UTC m=+21.087076220 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743582 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743599 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743613 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:22 crc kubenswrapper[4831]: E1203 06:31:22.743667 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:23.743645107 +0000 UTC m=+21.087228655 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.015375 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.015965 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.017047 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.017637 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.019123 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.019668 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.020337 4831 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.021350 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.022008 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.023102 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.023737 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.024751 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.025201 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.025694 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.026546 4831 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.027018 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.027928 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.028276 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.028816 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.029932 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.030443 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.031438 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.031867 4831 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.033283 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.033683 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.034277 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.036102 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.036749 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.037073 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.037627 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.038101 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.039088 4831 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.039258 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.040796 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.041613 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.041989 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.043413 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.044030 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.044954 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.045568 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.046557 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.046971 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.047862 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.048457 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.049468 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.049887 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.050704 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.051158 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.052141 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.052674 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.053482 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.053906 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.054759 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.055272 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.055715 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.060495 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.080538 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.114710 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.123354 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.125505 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364"} Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.126489 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"0988a3537e76d710913c8a803207f3759022667594e640bc3a337e0017176be2"} Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.127702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0"} Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.127753 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3d5c91f1f160a78e066f90b19b005fd2fe252507d7cace76db2872c5898b9661"} Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.130004 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5"} Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.130032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311"} Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.130046 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"88149117a45cf1d42c1b4811946bd009e34be94252978090519de5bc7859e730"} Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.132848 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.137092 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.160016 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-cjft5"] Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.160382 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-cjft5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.163950 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.164268 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.164573 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.198666 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.214691 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.236290 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.261559 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.290163 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.304800 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.316278 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.349124 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bdcd6b2b-8124-46f0-9b94-e32e05ef6e49-hosts-file\") pod \"node-resolver-cjft5\" (UID: \"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\") " pod="openshift-dns/node-resolver-cjft5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.349199 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4hvz\" (UniqueName: 
\"kubernetes.io/projected/bdcd6b2b-8124-46f0-9b94-e32e05ef6e49-kube-api-access-h4hvz\") pod \"node-resolver-cjft5\" (UID: \"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\") " pod="openshift-dns/node-resolver-cjft5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.354703 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.388327 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.411751 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.450168 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4hvz\" (UniqueName: \"kubernetes.io/projected/bdcd6b2b-8124-46f0-9b94-e32e05ef6e49-kube-api-access-h4hvz\") pod \"node-resolver-cjft5\" (UID: \"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\") " pod="openshift-dns/node-resolver-cjft5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.450237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bdcd6b2b-8124-46f0-9b94-e32e05ef6e49-hosts-file\") pod \"node-resolver-cjft5\" (UID: \"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\") " pod="openshift-dns/node-resolver-cjft5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.450394 4831 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/bdcd6b2b-8124-46f0-9b94-e32e05ef6e49-hosts-file\") pod \"node-resolver-cjft5\" (UID: \"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\") " pod="openshift-dns/node-resolver-cjft5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.473186 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4hvz\" (UniqueName: \"kubernetes.io/projected/bdcd6b2b-8124-46f0-9b94-e32e05ef6e49-kube-api-access-h4hvz\") pod \"node-resolver-cjft5\" (UID: \"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\") " pod="openshift-dns/node-resolver-cjft5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.476050 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-cjft5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.753836 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.753930 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.753962 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.753992 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.754016 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754098 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754160 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:25.754141014 +0000 UTC m=+23.097724522 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754223 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:31:25.754213807 +0000 UTC m=+23.097797315 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754309 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754358 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:25.754346291 +0000 UTC m=+23.097929799 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754427 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754442 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754456 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754485 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:25.754476425 +0000 UTC m=+23.098059933 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754536 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754550 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754559 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:23 crc kubenswrapper[4831]: E1203 06:31:23.754584 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:25.754576518 +0000 UTC m=+23.098160016 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.950561 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-vz8ft"] Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.951165 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vz8ft" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.953112 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.955789 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.957748 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.957935 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.957942 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-j2xfs"] Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.958045 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.958511 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.965808 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-dvcq5"] Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.966079 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.972337 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.973781 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.975329 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.975516 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.975634 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.975759 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.975905 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 06:31:23 crc kubenswrapper[4831]: I1203 06:31:23.976011 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.012429 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.012457 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.012456 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:24 crc kubenswrapper[4831]: E1203 06:31:24.012566 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:24 crc kubenswrapper[4831]: E1203 06:31:24.012654 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:24 crc kubenswrapper[4831]: E1203 06:31:24.012764 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.014859 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.043485 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-system-cni-dir\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055643 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-hostroot\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055662 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-os-release\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-cni-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055699 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-daemon-config\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055718 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-cnibin\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055740 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cc17c62-00e2-4756-afa5-60655e6a5a71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055858 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055892 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055923 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-cni-bin\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 
06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.055952 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-multus-certs\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056045 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-k8s-cni-cncf-io\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056088 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr6sx\" (UniqueName: \"kubernetes.io/projected/2cc17c62-00e2-4756-afa5-60655e6a5a71-kube-api-access-jr6sx\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056108 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-cnibin\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056125 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-conf-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 
06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056146 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-cni-multus\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056161 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-etc-kubernetes\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056181 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cc17c62-00e2-4756-afa5-60655e6a5a71-cni-binary-copy\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-system-cni-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056256 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbzrh\" (UniqueName: \"kubernetes.io/projected/74a16df4-1f25-4b0f-9e08-f6486f262a68-kube-api-access-qbzrh\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 
06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056340 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-socket-dir-parent\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-netns\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056385 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-kubelet\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056427 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-os-release\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.056452 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74a16df4-1f25-4b0f-9e08-f6486f262a68-cni-binary-copy\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.070659 
4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.083546 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.099043 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 
06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.111870 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.121082 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.133491 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cjft5" event={"ID":"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49","Type":"ContainerStarted","Data":"a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638"} Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.133547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-cjft5" 
event={"ID":"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49","Type":"ContainerStarted","Data":"b32eab2eaf6591bbaa55f92548abeeb991ddeac53170787e8aeb25b8b8798691"} Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.133896 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.136527 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.150769 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157044 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/2cc17c62-00e2-4756-afa5-60655e6a5a71-cni-binary-copy\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc 
kubenswrapper[4831]: I1203 06:31:24.157088 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-cni-multus\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-etc-kubernetes\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157132 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-system-cni-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157158 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkqgc\" (UniqueName: \"kubernetes.io/projected/4e04caf2-8e18-4af8-9779-c5711262077b-kube-api-access-zkqgc\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157183 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-netns\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157206 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qbzrh\" (UniqueName: \"kubernetes.io/projected/74a16df4-1f25-4b0f-9e08-f6486f262a68-kube-api-access-qbzrh\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157233 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-etc-kubernetes\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157249 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-socket-dir-parent\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157328 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-kubelet\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-socket-dir-parent\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157352 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-netns\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157389 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-kubelet\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157331 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-system-cni-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157383 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-os-release\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157469 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74a16df4-1f25-4b0f-9e08-f6486f262a68-cni-binary-copy\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157496 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-hostroot\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 
06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157520 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-system-cni-dir\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157566 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-hostroot\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157585 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-system-cni-dir\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157600 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-os-release\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157630 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-os-release\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157654 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-cni-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157689 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-daemon-config\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157700 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-os-release\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-cnibin\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157727 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-cnibin\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157747 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/2cc17c62-00e2-4756-afa5-60655e6a5a71-cni-binary-copy\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cc17c62-00e2-4756-afa5-60655e6a5a71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157810 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4e04caf2-8e18-4af8-9779-c5711262077b-rootfs\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-cni-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157839 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e04caf2-8e18-4af8-9779-c5711262077b-proxy-tls\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157902 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e04caf2-8e18-4af8-9779-c5711262077b-mcd-auth-proxy-config\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157942 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-cni-bin\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157954 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-cni-multus\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157967 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-multus-certs\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.157997 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-k8s-cni-cncf-io\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158016 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr6sx\" (UniqueName: \"kubernetes.io/projected/2cc17c62-00e2-4756-afa5-60655e6a5a71-kube-api-access-jr6sx\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158026 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-multus-certs\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158033 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-cnibin\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158063 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-conf-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158069 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-cnibin\") pod 
\"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-conf-dir\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158197 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-run-k8s-cni-cncf-io\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158659 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/74a16df4-1f25-4b0f-9e08-f6486f262a68-host-var-lib-cni-bin\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158692 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/2cc17c62-00e2-4756-afa5-60655e6a5a71-tuning-conf-dir\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158676 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/2cc17c62-00e2-4756-afa5-60655e6a5a71-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " 
pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.158829 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74a16df4-1f25-4b0f-9e08-f6486f262a68-cni-binary-copy\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.162675 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/74a16df4-1f25-4b0f-9e08-f6486f262a68-multus-daemon-config\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.162862 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers 
with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.173900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr6sx\" (UniqueName: \"kubernetes.io/projected/2cc17c62-00e2-4756-afa5-60655e6a5a71-kube-api-access-jr6sx\") pod \"multus-additional-cni-plugins-j2xfs\" (UID: \"2cc17c62-00e2-4756-afa5-60655e6a5a71\") " pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.176905 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qbzrh\" (UniqueName: \"kubernetes.io/projected/74a16df4-1f25-4b0f-9e08-f6486f262a68-kube-api-access-qbzrh\") pod \"multus-vz8ft\" (UID: \"74a16df4-1f25-4b0f-9e08-f6486f262a68\") " pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.180225 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\"
:\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.193595 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.203048 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.214595 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.227034 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.236734 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.258442 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.258576 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkqgc\" (UniqueName: \"kubernetes.io/projected/4e04caf2-8e18-4af8-9779-c5711262077b-kube-api-access-zkqgc\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.258951 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4e04caf2-8e18-4af8-9779-c5711262077b-rootfs\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.258998 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/4e04caf2-8e18-4af8-9779-c5711262077b-proxy-tls\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.259019 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e04caf2-8e18-4af8-9779-c5711262077b-mcd-auth-proxy-config\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.259180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/4e04caf2-8e18-4af8-9779-c5711262077b-rootfs\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.260010 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4e04caf2-8e18-4af8-9779-c5711262077b-mcd-auth-proxy-config\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.267772 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/4e04caf2-8e18-4af8-9779-c5711262077b-proxy-tls\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.274697 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.278827 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkqgc\" (UniqueName: \"kubernetes.io/projected/4e04caf2-8e18-4af8-9779-c5711262077b-kube-api-access-zkqgc\") pod \"machine-config-daemon-dvcq5\" (UID: \"4e04caf2-8e18-4af8-9779-c5711262077b\") " pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.288330 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.291609 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vz8ft" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.291642 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.291659 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.304018 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.316700 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: W1203 06:31:24.321805 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cc17c62_00e2_4756_afa5_60655e6a5a71.slice/crio-a68f5dca1e177241e1ed8ccf7689c08826e7f4c7833e83b646d6a4b3d1114697 WatchSource:0}: Error finding container a68f5dca1e177241e1ed8ccf7689c08826e7f4c7833e83b646d6a4b3d1114697: Status 404 returned error can't find the container with id a68f5dca1e177241e1ed8ccf7689c08826e7f4c7833e83b646d6a4b3d1114697 Dec 03 06:31:24 crc kubenswrapper[4831]: W1203 06:31:24.322610 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e04caf2_8e18_4af8_9779_c5711262077b.slice/crio-13783f5b8d900adc9ef266d42b92f63cb2b7bbc362cb5841fd8e9cb9ee623bd1 WatchSource:0}: Error finding container 
13783f5b8d900adc9ef266d42b92f63cb2b7bbc362cb5841fd8e9cb9ee623bd1: Status 404 returned error can't find the container with id 13783f5b8d900adc9ef266d42b92f63cb2b7bbc362cb5841fd8e9cb9ee623bd1 Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.329019 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.340269 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.353301 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ps95j"] Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.354561 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.355459 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.357713 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.357743 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.359299 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.359558 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.359631 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.359490 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.359690 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.367808 4831 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.383018 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.394161 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.407412 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.422834 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.443711 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\
\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 
06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.458706 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460173 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-systemd-units\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460213 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-ovn\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460233 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-etc-openvswitch\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460277 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-node-log\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460322 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-kubelet\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460353 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-slash\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460373 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-bin\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460395 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sntzl\" (UniqueName: \"kubernetes.io/projected/3d7d0c92-6857-4846-93ab-3364282a1e85-kube-api-access-sntzl\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 
06:31:24.460413 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-var-lib-openvswitch\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460429 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-ovn-kubernetes\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460445 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-config\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460473 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d7d0c92-6857-4846-93ab-3364282a1e85-ovn-node-metrics-cert\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460534 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-netns\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460550 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-netd\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460583 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-env-overrides\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460598 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-systemd\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460612 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-openvswitch\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460628 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-log-socket\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460677 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.460702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-script-lib\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.471425 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.484391 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.509437 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.537876 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.551001 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561579 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-script-lib\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-systemd-units\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561646 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-ovn\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561675 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-etc-openvswitch\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561731 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-node-log\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561750 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-kubelet\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561769 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-slash\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561789 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-bin\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561812 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sntzl\" (UniqueName: 
\"kubernetes.io/projected/3d7d0c92-6857-4846-93ab-3364282a1e85-kube-api-access-sntzl\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-var-lib-openvswitch\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561852 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-ovn-kubernetes\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561849 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-ovn\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561872 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-config\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561951 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/3d7d0c92-6857-4846-93ab-3364282a1e85-ovn-node-metrics-cert\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561978 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-netns\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.561996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-netd\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562051 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-env-overrides\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562067 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-systemd\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562083 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-openvswitch\") 
pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562097 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-log-socket\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562123 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562186 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562210 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-systemd-units\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562507 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-slash\") pod 
\"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562572 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-etc-openvswitch\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-script-lib\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562611 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-node-log\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562653 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-kubelet\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562655 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-config\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" 
Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562692 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-ovn-kubernetes\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562741 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-var-lib-openvswitch\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562783 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-bin\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562816 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-netd\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562842 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-netns\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562842 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-openvswitch\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562861 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-systemd\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.562877 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-log-socket\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.563084 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-env-overrides\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.564623 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.565879 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d7d0c92-6857-4846-93ab-3364282a1e85-ovn-node-metrics-cert\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.574955 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.579700 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sntzl\" (UniqueName: \"kubernetes.io/projected/3d7d0c92-6857-4846-93ab-3364282a1e85-kube-api-access-sntzl\") pod \"ovnkube-node-ps95j\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.587409 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.600163 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.613674 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:24Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:24 crc kubenswrapper[4831]: I1203 06:31:24.687472 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:24 crc kubenswrapper[4831]: W1203 06:31:24.705418 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d7d0c92_6857_4846_93ab_3364282a1e85.slice/crio-18f819e0cd64ee0bce1037ff1266ef042a5491ae047c2f407f4e4fb3d863dea3 WatchSource:0}: Error finding container 18f819e0cd64ee0bce1037ff1266ef042a5491ae047c2f407f4e4fb3d863dea3: Status 404 returned error can't find the container with id 18f819e0cd64ee0bce1037ff1266ef042a5491ae047c2f407f4e4fb3d863dea3 Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.137675 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cc17c62-00e2-4756-afa5-60655e6a5a71" containerID="72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e" exitCode=0 Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.137740 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" event={"ID":"2cc17c62-00e2-4756-afa5-60655e6a5a71","Type":"ContainerDied","Data":"72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.137769 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" event={"ID":"2cc17c62-00e2-4756-afa5-60655e6a5a71","Type":"ContainerStarted","Data":"a68f5dca1e177241e1ed8ccf7689c08826e7f4c7833e83b646d6a4b3d1114697"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.140464 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.140536 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.140557 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"13783f5b8d900adc9ef266d42b92f63cb2b7bbc362cb5841fd8e9cb9ee623bd1"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.142690 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vz8ft" event={"ID":"74a16df4-1f25-4b0f-9e08-f6486f262a68","Type":"ContainerStarted","Data":"7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.142747 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vz8ft" event={"ID":"74a16df4-1f25-4b0f-9e08-f6486f262a68","Type":"ContainerStarted","Data":"453790abb92d14715d8976da93a80c64212cc9bac0625036a88477e1dcd10597"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.144308 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.145466 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797" exitCode=0 Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.145534 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" 
event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.145571 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"18f819e0cd64ee0bce1037ff1266ef042a5491ae047c2f407f4e4fb3d863dea3"} Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.154249 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env
\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.170799 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.182243 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.192288 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.206719 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.225813 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.238698 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.250885 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.265710 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.280237 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.294622 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.307414 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.319606 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.330245 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.342731 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.357682 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.373433 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.395575 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.409712 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.424176 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.438056 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-dm6hd"] Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.438643 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.439643 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\
"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.442210 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.442375 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.442553 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.443303 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.454840 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.474002 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.488933 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.504044 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.525164 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.534680 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.545527 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.559176 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.574940 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-serviceca\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.574988 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-host\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.575020 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zppkv\" (UniqueName: \"kubernetes.io/projected/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-kube-api-access-zppkv\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.605930 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.621593 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.639102 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.657684 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.670290 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.675761 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-serviceca\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.675820 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-host\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.675859 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zppkv\" (UniqueName: \"kubernetes.io/projected/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-kube-api-access-zppkv\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.676184 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-host\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.676971 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-serviceca\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.682708 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.697889 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zppkv\" (UniqueName: \"kubernetes.io/projected/8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1-kube-api-access-zppkv\") pod \"node-ca-dm6hd\" (UID: \"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\") " pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.700333 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.727052 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.776283 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.776433 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776482 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:31:29.77644266 +0000 UTC m=+27.120026168 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.776558 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776597 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.776628 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776675 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:29.776655217 +0000 UTC m=+27.120238735 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.776703 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776723 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776810 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776815 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776832 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:29.776810411 +0000 UTC m=+27.120393909 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776833 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776841 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776856 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776863 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776916 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:29.776906135 +0000 UTC m=+27.120489663 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.776937 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:29.776927525 +0000 UTC m=+27.120511033 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.790015 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-dm6hd" Dec 03 06:31:25 crc kubenswrapper[4831]: W1203 06:31:25.802729 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c66c9ba_10ff_43b1_baab_d4bb0b32d7a1.slice/crio-2a132c6ff6b00831c4956c15e5df1ba5ac6dd5ead53ba53fddeb731f95a8d984 WatchSource:0}: Error finding container 2a132c6ff6b00831c4956c15e5df1ba5ac6dd5ead53ba53fddeb731f95a8d984: Status 404 returned error can't find the container with id 2a132c6ff6b00831c4956c15e5df1ba5ac6dd5ead53ba53fddeb731f95a8d984 Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.806117 4831 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.807686 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.807739 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.807757 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.807897 4831 kubelet_node_status.go:76] "Attempting to register node" node="crc" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.814032 4831 kubelet_node_status.go:115] "Node was previously registered" node="crc" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.814292 4831 kubelet_node_status.go:79] "Successfully registered node" node="crc" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.815889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.815933 4831 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.815950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.815972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.815989 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:25Z","lastTransitionTime":"2025-12-03T06:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.844383 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.855966 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.856000 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.856010 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.856026 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.856037 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:25Z","lastTransitionTime":"2025-12-03T06:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.868541 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.871211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.871248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.871260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.871274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.871286 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:25Z","lastTransitionTime":"2025-12-03T06:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.882181 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.885518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.885543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.885551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.885565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.885603 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:25Z","lastTransitionTime":"2025-12-03T06:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.899819 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.909609 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.909655 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.909668 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.909688 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.909700 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:25Z","lastTransitionTime":"2025-12-03T06:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.924197 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:25Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:25 crc kubenswrapper[4831]: E1203 06:31:25.924389 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.928036 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.928076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.928088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.928107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:25 crc kubenswrapper[4831]: I1203 06:31:25.928118 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:25Z","lastTransitionTime":"2025-12-03T06:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.012283 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.012360 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.012290 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:26 crc kubenswrapper[4831]: E1203 06:31:26.012472 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:26 crc kubenswrapper[4831]: E1203 06:31:26.012601 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:26 crc kubenswrapper[4831]: E1203 06:31:26.012719 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.030408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.030784 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.030795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.030816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.030828 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.135895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.135954 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.135972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.136048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.136109 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.156859 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.156925 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.156938 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.156949 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.156959 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.156969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} Dec 03 06:31:26 crc kubenswrapper[4831]: 
I1203 06:31:26.158792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dm6hd" event={"ID":"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1","Type":"ContainerStarted","Data":"8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.158825 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dm6hd" event={"ID":"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1","Type":"ContainerStarted","Data":"2a132c6ff6b00831c4956c15e5df1ba5ac6dd5ead53ba53fddeb731f95a8d984"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.163562 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cc17c62-00e2-4756-afa5-60655e6a5a71" containerID="7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f" exitCode=0 Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.163656 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" event={"ID":"2cc17c62-00e2-4756-afa5-60655e6a5a71","Type":"ContainerDied","Data":"7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.184903 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.200540 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.212905 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.223368 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.238852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.238913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.238924 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.238940 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.238971 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.244342 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.266538 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.278451 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.295760 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.304941 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.317990 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.337221 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.346271 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.346302 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.346327 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.346345 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.346355 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.355565 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.368034 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.387705 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.403475 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.418265 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.445168 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.448773 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.448810 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.448820 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.448836 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.448845 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.481888 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.523659 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.551890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.551938 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.551991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.552015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.552029 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.559156 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.603522 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.650294 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.654678 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.654721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.654731 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.654752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.654763 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.683506 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.721935 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.758433 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.758489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.758508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.758533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.758553 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.770480 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.808280 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:26Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.860303 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.860367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.860376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.860392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.860402 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.962557 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.962587 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.962595 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.962607 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:26 crc kubenswrapper[4831]: I1203 06:31:26.962616 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:26Z","lastTransitionTime":"2025-12-03T06:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.065295 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.065353 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.065363 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.065377 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.065386 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.167901 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.167941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.167951 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.167968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.167978 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.169583 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cc17c62-00e2-4756-afa5-60655e6a5a71" containerID="3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578" exitCode=0 Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.169637 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" event={"ID":"2cc17c62-00e2-4756-afa5-60655e6a5a71","Type":"ContainerDied","Data":"3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.190866 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging 
kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa
41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\
\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,
\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.207583 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.223919 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 
06:31:27.233826 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.244478 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a
69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.258180 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 
06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.270156 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.270263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 
06:31:27.270279 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.270287 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.270300 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.270307 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.282371 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.297404 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.312416 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.323442 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.334486 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.346485 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.372914 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc 
kubenswrapper[4831]: I1203 06:31:27.372963 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.372973 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.372988 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.373000 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.475306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.475372 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.475383 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.475399 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.475410 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.578250 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.578291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.578302 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.578338 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.578350 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.681651 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.681712 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.681729 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.681755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.681772 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.785067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.785123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.785142 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.785166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.785186 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.888129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.888183 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.888201 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.888223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.888239 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.920866 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.927788 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.934384 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.945803 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.965819 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.980727 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.991247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.991348 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.991375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.991405 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.991422 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:27Z","lastTransitionTime":"2025-12-03T06:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:27 crc kubenswrapper[4831]: I1203 06:31:27.996050 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.012686 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.012789 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.012706 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:28 crc kubenswrapper[4831]: E1203 06:31:28.012875 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:28 crc kubenswrapper[4831]: E1203 06:31:28.013087 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:28 crc kubenswrapper[4831]: E1203 06:31:28.013254 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.016104 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.037591 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.057040 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.077630 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.091785 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.093389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc 
kubenswrapper[4831]: I1203 06:31:28.093430 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.093447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.093469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.093486 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.103882 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.118667 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.142613 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\
\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\"
,\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.184123 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.191352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" event={"ID":"2cc17c62-00e2-4756-afa5-60655e6a5a71","Type":"ContainerDied","Data":"22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.191299 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cc17c62-00e2-4756-afa5-60655e6a5a71" containerID="22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738" exitCode=0 Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.195615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.195650 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.195663 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.195681 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: 
I1203 06:31:28.195695 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.201829 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.220531 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.236520 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.253436 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.272607 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.285983 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.299248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.299370 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.299398 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.299432 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.299459 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.303858 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.320431 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 
06:31:28.342307 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.362877 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.381670 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.399715 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.401595 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.401830 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.401927 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.402018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.402093 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.414232 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.428169 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.464224 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2
025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.505193 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.505240 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.505259 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.505282 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.505300 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.505968 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.544687 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.588135 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.607986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.608019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.608032 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.608048 4831 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.608060 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.631615 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.666486 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.707670 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.710660 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.710701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.710720 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.710741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.710757 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.745833 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.788305 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.814177 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.814222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.814238 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.814263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.814280 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.826871 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.865660 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.906028 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.916673 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.916724 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.916740 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.916765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.916785 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:28Z","lastTransitionTime":"2025-12-03T06:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.944463 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:28 crc kubenswrapper[4831]: I1203 06:31:28.982959 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:28Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.041035 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.041132 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.041152 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.041210 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.041229 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.144129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.144200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.144227 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.144258 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.144281 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.223876 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.228049 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cc17c62-00e2-4756-afa5-60655e6a5a71" containerID="4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc" exitCode=0 Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.228098 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" event={"ID":"2cc17c62-00e2-4756-afa5-60655e6a5a71","Type":"ContainerDied","Data":"4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.249020 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.249079 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.249103 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.249134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.249157 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.253538 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.268837 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.279273 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.302826 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.319434 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.337704 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.351255 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.351291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.351300 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.351327 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.351337 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.354590 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.372666 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\
"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.395736 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274b
ce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.414413 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e
35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2
025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.433889 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"
mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.454497 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.454548 4831 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.454565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.454587 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.454603 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.468025 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.510918 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.547747 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:29Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.557374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.557439 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.557461 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc 
kubenswrapper[4831]: I1203 06:31:29.557490 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.557512 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.660084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.660133 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.660151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.660175 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.660192 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.763697 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.763762 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.763781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.763806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.763822 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.843637 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.843823 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.843900 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:31:37.843867198 +0000 UTC m=+35.187450746 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.843958 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.844002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.844044 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844067 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:29 crc 
kubenswrapper[4831]: E1203 06:31:29.844165 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844196 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:37.844162108 +0000 UTC m=+35.187745656 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844171 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844220 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844243 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844262 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844206 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844405 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844376 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:37.844258051 +0000 UTC m=+35.187841639 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844493 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:37.844463327 +0000 UTC m=+35.188046955 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:29 crc kubenswrapper[4831]: E1203 06:31:29.844536 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:37.844513659 +0000 UTC m=+35.188097307 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.867369 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.867437 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.867458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.867485 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.867504 4831 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.970713 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.970750 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.970762 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.970779 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:29 crc kubenswrapper[4831]: I1203 06:31:29.970791 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:29Z","lastTransitionTime":"2025-12-03T06:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.012271 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.012398 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:30 crc kubenswrapper[4831]: E1203 06:31:30.012447 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:30 crc kubenswrapper[4831]: E1203 06:31:30.012592 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.012707 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:30 crc kubenswrapper[4831]: E1203 06:31:30.012798 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.073223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.073260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.073272 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.073289 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.073300 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.176283 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.176366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.176384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.176406 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.176422 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.236793 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cc17c62-00e2-4756-afa5-60655e6a5a71" containerID="cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266" exitCode=0 Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.236862 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" event={"ID":"2cc17c62-00e2-4756-afa5-60655e6a5a71","Type":"ContainerDied","Data":"cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.259631 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCoun
t\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.280951 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.282676 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.282728 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.282747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 
06:31:30.282774 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.282790 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.299815 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.320059 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.344441 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.364354 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.384033 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.385821 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.385868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.385877 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.386053 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.386073 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.408235 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.438722 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.461712 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.481990 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.489827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.489863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.489875 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.489903 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.489916 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.498772 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.516199 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.527098 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:30Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.594591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.594636 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.594648 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.594664 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.594673 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.697865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.697907 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.697918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.697935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.697947 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.800287 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.800344 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.800358 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.800378 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.800391 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.903582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.903615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.903623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.903653 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:30 crc kubenswrapper[4831]: I1203 06:31:30.903662 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:30Z","lastTransitionTime":"2025-12-03T06:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.006835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.006867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.006876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.006890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.006898 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.109341 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.109392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.109422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.109440 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.109451 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.211278 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.211337 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.211348 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.211365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.211375 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.247949 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" event={"ID":"2cc17c62-00e2-4756-afa5-60655e6a5a71","Type":"ContainerStarted","Data":"af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.254825 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.255120 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.255180 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.294143 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.295756 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.312262 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.313742 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.313781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.313805 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.313821 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 
06:31:31.313831 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.323843 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.334981 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42e
a83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.347229 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.361598 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.382874 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.397221 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.410835 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.416482 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.416507 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.416520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.416538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.416551 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.425192 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.436951 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.447690 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.460442 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.473372 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.491247 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.507634 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]
}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.519015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.519061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.519078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.519100 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.519116 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.520841 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.541499 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.554441 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.568917 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.582916 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.605340 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.620448 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.621927 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.621952 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.621960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.621987 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.621996 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.638709 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.649257 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b198
88cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.658484 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.676576 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.688644 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.698933 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:31Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.724123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.724190 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.724216 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.724247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.724270 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.827365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.827452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.827502 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.827526 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.827543 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.930972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.931061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.931080 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.931158 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:31 crc kubenswrapper[4831]: I1203 06:31:31.931179 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:31Z","lastTransitionTime":"2025-12-03T06:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.012500 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.012550 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.012515 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:32 crc kubenswrapper[4831]: E1203 06:31:32.012695 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:32 crc kubenswrapper[4831]: E1203 06:31:32.012849 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:32 crc kubenswrapper[4831]: E1203 06:31:32.012982 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.034018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.034060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.034075 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.034100 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.034119 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.137614 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.137687 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.137711 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.137743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.137768 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.240947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.241034 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.241063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.241093 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.241116 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.258381 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.344326 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.344374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.344388 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.344407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.344421 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.447188 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.447225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.447233 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.447247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.447256 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.550236 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.550304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.550348 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.550372 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.550389 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.653738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.653830 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.653851 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.653884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.653906 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.756719 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.756758 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.756768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.756784 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.756795 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.859868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.860281 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.860305 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.860357 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.860375 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.963681 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.963741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.963758 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.963782 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:32 crc kubenswrapper[4831]: I1203 06:31:32.963799 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:32Z","lastTransitionTime":"2025-12-03T06:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.029355 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.047892 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.061583 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.066243 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.066301 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.066350 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.066375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.066394 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.078820 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.100225 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.128483 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.150094 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.166459 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.170459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.170511 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.170531 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.170559 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.170582 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.184021 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.199662 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.214629 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.235618 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.251127 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.262980 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.267461 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.273305 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.273356 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.273367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.273384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.273396 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.375626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.375684 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.375701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.375725 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.375742 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.477996 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.478048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.478062 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.478083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.478097 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.582295 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.582424 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.582442 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.582536 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.582563 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.685659 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.685712 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.685729 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.685751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.685768 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.788785 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.788846 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.788864 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.788890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.788912 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.891866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.891929 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.891948 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.891974 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.891991 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.994955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.995029 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.995052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.995081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:33 crc kubenswrapper[4831]: I1203 06:31:33.995103 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:33Z","lastTransitionTime":"2025-12-03T06:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.012114 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.012130 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:34 crc kubenswrapper[4831]: E1203 06:31:34.012349 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:34 crc kubenswrapper[4831]: E1203 06:31:34.012494 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.012154 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:34 crc kubenswrapper[4831]: E1203 06:31:34.012633 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.097708 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.097759 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.097781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.097810 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.097834 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.201603 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.201665 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.201692 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.201724 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.201746 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.268433 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/0.log" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.272486 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07" exitCode=1 Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.272540 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.274381 4831 scope.go:117] "RemoveContainer" containerID="0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.297796 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.304861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.304902 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.304913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.304930 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.304942 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.317092 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.331082 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.344152 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.363557 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.379286 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.395790 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.408128 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.408181 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.408199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 
06:31:34.408225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.408244 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.414832 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.431654 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.450306 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.462518 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.477197 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.496179 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.511180 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.511223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.511234 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.511254 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.511269 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.526806 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:33Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:31:33.035445 6137 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:31:33.035499 6137 handler.go:190] Sending *v1.Node event handler 7 
for removal\\\\nI1203 06:31:33.035553 6137 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:31:33.035566 6137 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:33.035588 6137 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:31:33.035597 6137 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:33.035604 6137 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:31:33.035618 6137 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:33.035624 6137 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:33.035629 6137 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:33.035632 6137 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:33.035645 6137 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:33.035657 6137 factory.go:656] Stopping watch factory\\\\nI1203 06:31:33.035659 6137 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:31:33.035672 6137 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:34Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.614371 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.614419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.614435 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.614453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.614466 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.717411 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.717444 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.717458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.717475 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.717485 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.820795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.820846 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.820861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.820885 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.820901 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.923502 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.923538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.923547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.923562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:34 crc kubenswrapper[4831]: I1203 06:31:34.923572 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:34Z","lastTransitionTime":"2025-12-03T06:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.026008 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.026063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.026083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.026105 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.026122 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.129159 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.129200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.129214 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.129232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.129244 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.232342 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.232398 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.232413 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.232435 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.232450 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.279817 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/0.log" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.283243 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.283390 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.306449 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.324224 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.335492 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.335569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.335591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.335622 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.335646 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.346685 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.362648 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.387613 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.408389 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.427899 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.438005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.438083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.438110 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.438141 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.438164 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.446784 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.470620 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.494083 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.512535 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.531605 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.541349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.541411 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.541431 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.541456 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.541477 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.561745 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.599387 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:33Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:31:33.035445 6137 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:31:33.035499 6137 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:31:33.035553 6137 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 
06:31:33.035566 6137 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:33.035588 6137 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:31:33.035597 6137 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:33.035604 6137 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:31:33.035618 6137 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:33.035624 6137 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:33.035629 6137 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:33.035632 6137 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:33.035645 6137 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:33.035657 6137 factory.go:656] Stopping watch factory\\\\nI1203 06:31:33.035659 6137 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:31:33.035672 6137 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:35Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.645432 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.645504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.645521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.645547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.645564 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.749097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.749135 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.749144 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.749159 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.749168 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.851222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.851286 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.851304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.851366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.851383 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.954563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.954635 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.954671 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.954707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:35 crc kubenswrapper[4831]: I1203 06:31:35.954732 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:35Z","lastTransitionTime":"2025-12-03T06:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.011825 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.011903 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.012288 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.012517 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.012606 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.012787 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.057504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.057563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.057583 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.057608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.057627 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.095356 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9"] Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.096023 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.100698 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.101072 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.111001 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wl7l\" (UniqueName: \"kubernetes.io/projected/877c82a5-4683-47a1-8a61-639e563263af-kube-api-access-2wl7l\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.111068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/877c82a5-4683-47a1-8a61-639e563263af-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.111098 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/877c82a5-4683-47a1-8a61-639e563263af-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.111155 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/877c82a5-4683-47a1-8a61-639e563263af-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.118671 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.130890 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is 
after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.144360 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.159947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.160219 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.160354 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.160494 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.160611 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.162380 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab
0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4d
e9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\
\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06
:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.181678 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready 
status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d
2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:33Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:31:33.035445 
6137 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:31:33.035499 6137 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:31:33.035553 6137 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:31:33.035566 6137 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:33.035588 6137 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:31:33.035597 6137 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:33.035604 6137 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:31:33.035618 6137 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:33.035624 6137 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:33.035629 6137 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:33.035632 6137 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:33.035645 6137 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:33.035657 6137 factory.go:656] Stopping watch factory\\\\nI1203 06:31:33.035659 6137 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:31:33.035672 6137 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.197048 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.209583 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.212071 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wl7l\" (UniqueName: \"kubernetes.io/projected/877c82a5-4683-47a1-8a61-639e563263af-kube-api-access-2wl7l\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.212151 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/877c82a5-4683-47a1-8a61-639e563263af-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.212187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/877c82a5-4683-47a1-8a61-639e563263af-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.212287 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/877c82a5-4683-47a1-8a61-639e563263af-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.213104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/877c82a5-4683-47a1-8a61-639e563263af-env-overrides\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.213809 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/877c82a5-4683-47a1-8a61-639e563263af-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.222022 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/877c82a5-4683-47a1-8a61-639e563263af-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc 
kubenswrapper[4831]: I1203 06:31:36.228170 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.228555 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wl7l\" (UniqueName: \"kubernetes.io/projected/877c82a5-4683-47a1-8a61-639e563263af-kube-api-access-2wl7l\") pod \"ovnkube-control-plane-749d76644c-nbmt9\" (UID: \"877c82a5-4683-47a1-8a61-639e563263af\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.241300 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.253363 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.263298 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.263375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.263394 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.263422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.263442 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.268234 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.278930 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9a
d6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.289414 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/1.log" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.290308 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/0.log" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.293721 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a" exitCode=1 Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.293793 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.293882 4831 scope.go:117] "RemoveContainer" containerID="0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.298385 4831 scope.go:117] "RemoveContainer" containerID="5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.299876 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.300142 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.303140 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.303485 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.303503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.303525 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.303543 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.314728 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.315459 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.318873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.318900 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.318910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.318926 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.318935 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.327489 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.330639 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.338739 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.338769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.338781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.338802 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.338816 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.345675 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.354762 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.357658 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.358178 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.358205 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.358216 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.358234 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.358245 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.368014 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":
\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.374807 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.378755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.378786 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.378797 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.378814 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.378826 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.385349 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.393260 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: E1203 06:31:36.393558 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.395411 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.395466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.395484 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.395510 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.395528 4831 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.401569 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},
{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.416833 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.417000 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.433179 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}
]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: W1203 06:31:36.434417 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod877c82a5_4683_47a1_8a61_639e563263af.slice/crio-b178af5b4b1e65abb9bca92a8633720811c1706dd94c121fd32ebe9072fa973a WatchSource:0}: Error finding container b178af5b4b1e65abb9bca92a8633720811c1706dd94c121fd32ebe9072fa973a: Status 404 returned error can't find the container with id b178af5b4b1e65abb9bca92a8633720811c1706dd94c121fd32ebe9072fa973a Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.451689 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.469987 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\"
,\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.483170 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\"
:\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.495769 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.497657 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.497690 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.497701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.497718 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.497729 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.517042 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.545356 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:33Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:31:33.035445 6137 
handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:31:33.035499 6137 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:31:33.035553 6137 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 06:31:33.035566 6137 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:33.035588 6137 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:31:33.035597 6137 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:33.035604 6137 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:31:33.035618 6137 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:33.035624 6137 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:33.035629 6137 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:33.035632 6137 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:33.035645 6137 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:33.035657 6137 factory.go:656] Stopping watch factory\\\\nI1203 06:31:33.035659 6137 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:31:33.035672 6137 ovnkube.go:599] Stopped ovnkube\\\\nI1203 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:35Z\\\",\\\"message\\\":\\\"esses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.140],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1203 06:31:35.236645 6262 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 06:31:35.236299 6262 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kuber
netes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e98
7c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.559587 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.573458 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:36Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.600512 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.600569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.600588 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.600611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.600629 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.703540 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.703603 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.703625 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.703653 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.703674 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.811473 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.811536 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.811553 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.811576 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.811594 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.914950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.915018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.915036 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.915061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:36 crc kubenswrapper[4831]: I1203 06:31:36.915078 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:36Z","lastTransitionTime":"2025-12-03T06:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.022124 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.022202 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.022229 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.022261 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.022288 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.057769 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.078614 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.100185 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.119499 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.125632 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.125909 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.125929 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.125955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.125973 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.138054 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z 
is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.152434 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.171298 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.189000 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.211877 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:33Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:31:33.035445 6137 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:31:33.035499 6137 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:31:33.035553 6137 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 
06:31:33.035566 6137 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:33.035588 6137 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:31:33.035597 6137 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:33.035604 6137 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:31:33.035618 6137 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:33.035624 6137 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:33.035629 6137 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:33.035632 6137 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:33.035645 6137 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:33.035657 6137 factory.go:656] Stopping watch factory\\\\nI1203 06:31:33.035659 6137 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:31:33.035672 6137 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:35Z\\\",\\\"message\\\":\\\"esses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.140],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1203 06:31:35.236645 6262 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) 
and 0 (template) load balancers\\\\nI1203 06:31:35.236299 6262 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.222525 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.228223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.228252 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.228261 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.228274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.228286 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.246124 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.269096 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.287289 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.297810 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" event={"ID":"877c82a5-4683-47a1-8a61-639e563263af","Type":"ContainerStarted","Data":"610be2c865c6b97753368226340ec6f0673ef761258593cb5b8f803c752ff0b6"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.297854 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" event={"ID":"877c82a5-4683-47a1-8a61-639e563263af","Type":"ContainerStarted","Data":"34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.297865 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" event={"ID":"877c82a5-4683-47a1-8a61-639e563263af","Type":"ContainerStarted","Data":"b178af5b4b1e65abb9bca92a8633720811c1706dd94c121fd32ebe9072fa973a"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.299410 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/1.log" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.302681 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.315828 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-
03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.329812 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.330666 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.330696 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.330704 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.330716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.330725 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.343416 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursive
ReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.353454 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.361820 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.373418 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.383998 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.399899 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.412527 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.428617 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.433485 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.433550 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.433568 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.433591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.433608 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.451595 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:33Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:31:33.035445 6137 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:31:33.035499 6137 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:31:33.035553 6137 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 
06:31:33.035566 6137 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:33.035588 6137 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:31:33.035597 6137 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:33.035604 6137 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:31:33.035618 6137 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:33.035624 6137 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:33.035629 6137 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:33.035632 6137 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:33.035645 6137 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:33.035657 6137 factory.go:656] Stopping watch factory\\\\nI1203 06:31:33.035659 6137 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:31:33.035672 6137 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:35Z\\\",\\\"message\\\":\\\"esses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.140],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1203 06:31:35.236645 6262 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) 
and 0 (template) load balancers\\\\nI1203 06:31:35.236299 6262 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.468177 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.481536 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.493220 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.506016 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.517541 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.539611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.539682 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.539807 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.539851 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.539882 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.544125 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9
fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\
"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:37Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.642423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.642487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.642499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 
06:31:37.642517 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.642527 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.744803 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.744842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.744852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.744867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.744877 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.847837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.847903 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.847927 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.847955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.847977 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.930367 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.930564 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.930624 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930653 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:31:53.930611946 +0000 UTC m=+51.274195504 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.930730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930742 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930761 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930794 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930812 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:37 crc 
kubenswrapper[4831]: E1203 06:31:37.930854 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:53.930806192 +0000 UTC m=+51.274405370 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930895 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930922 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930930 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:53.930912045 +0000 UTC m=+51.274495683 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.930937 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.931003 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:53.930985938 +0000 UTC m=+51.274569586 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.931008 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.930895 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.931075 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:53.93105547 +0000 UTC m=+51.274638988 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.951638 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.951716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.951736 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.952190 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.952256 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:37Z","lastTransitionTime":"2025-12-03T06:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.986011 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-lllsw"] Dec 03 06:31:37 crc kubenswrapper[4831]: I1203 06:31:37.986946 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:37 crc kubenswrapper[4831]: E1203 06:31:37.987082 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.012492 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.012547 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.012528 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: E1203 06:31:38.012687 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.012519 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:38 crc kubenswrapper[4831]: E1203 06:31:38.012828 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:38 crc kubenswrapper[4831]: E1203 06:31:38.012981 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.033707 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.033777 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jjmv\" (UniqueName: \"kubernetes.io/projected/8283839a-a189-493f-bde7-e0193d575963-kube-api-access-8jjmv\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.035829 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.055386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.055444 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.055462 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc 
kubenswrapper[4831]: I1203 06:31:38.055486 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.055504 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.058018 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 
06:31:38.080366 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{
\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad
2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"
cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11
\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.114213 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:33Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:31:33.035445 6137 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:31:33.035499 6137 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:31:33.035553 6137 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 
06:31:33.035566 6137 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:33.035588 6137 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:31:33.035597 6137 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:33.035604 6137 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:31:33.035618 6137 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:33.035624 6137 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:33.035629 6137 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:33.035632 6137 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:33.035645 6137 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:33.035657 6137 factory.go:656] Stopping watch factory\\\\nI1203 06:31:33.035659 6137 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:31:33.035672 6137 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:35Z\\\",\\\"message\\\":\\\"esses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.140],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1203 06:31:35.236645 6262 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) 
and 0 (template) load balancers\\\\nI1203 06:31:35.236299 6262 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.132617 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.135210 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " 
pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.135382 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jjmv\" (UniqueName: \"kubernetes.io/projected/8283839a-a189-493f-bde7-e0193d575963-kube-api-access-8jjmv\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:38 crc kubenswrapper[4831]: E1203 06:31:38.135449 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:38 crc kubenswrapper[4831]: E1203 06:31:38.135603 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs podName:8283839a-a189-493f-bde7-e0193d575963 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:38.635546332 +0000 UTC m=+35.979129930 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs") pod "network-metrics-daemon-lllsw" (UID: "8283839a-a189-493f-bde7-e0193d575963") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.150793 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.158529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.158585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.158604 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.158629 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.158647 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.168903 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jjmv\" (UniqueName: \"kubernetes.io/projected/8283839a-a189-493f-bde7-e0193d575963-kube-api-access-8jjmv\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.171439 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.189633 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.207129 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.222927 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc 
kubenswrapper[4831]: I1203 06:31:38.245364 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2
f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.261488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.261548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.261565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.261594 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.261612 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.268788 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\
\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.289735 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.307572 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.327465 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:38Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.364579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.364628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.364646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.364670 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.364687 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.467855 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.467920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.467938 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.467963 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.467979 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.571538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.571586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.571605 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.571632 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.571650 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.641505 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:38 crc kubenswrapper[4831]: E1203 06:31:38.641861 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:38 crc kubenswrapper[4831]: E1203 06:31:38.642002 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs podName:8283839a-a189-493f-bde7-e0193d575963 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:39.641960108 +0000 UTC m=+36.985543686 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs") pod "network-metrics-daemon-lllsw" (UID: "8283839a-a189-493f-bde7-e0193d575963") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.675036 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.675112 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.675129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.675152 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.675170 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.778017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.778080 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.778099 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.778126 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.778142 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.885144 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.885219 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.885280 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.885308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.885359 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.988376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.988455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.988480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.988511 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:38 crc kubenswrapper[4831]: I1203 06:31:38.988536 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:38Z","lastTransitionTime":"2025-12-03T06:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.091578 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.091647 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.091667 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.091694 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.091712 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.194920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.194986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.195009 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.195037 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.195058 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.298223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.298290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.298308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.298384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.298402 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.402384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.402468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.402493 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.402522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.402541 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.506050 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.506123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.506143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.506174 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.506196 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.609488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.609557 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.609582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.609613 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.609631 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.653502 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:39 crc kubenswrapper[4831]: E1203 06:31:39.653826 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:39 crc kubenswrapper[4831]: E1203 06:31:39.653967 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs podName:8283839a-a189-493f-bde7-e0193d575963 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:41.653937281 +0000 UTC m=+38.997520829 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs") pod "network-metrics-daemon-lllsw" (UID: "8283839a-a189-493f-bde7-e0193d575963") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.712874 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.712950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.713017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.713049 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.713071 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.816716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.816791 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.816814 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.816841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.816859 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.920496 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.920558 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.920574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.920598 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:39 crc kubenswrapper[4831]: I1203 06:31:39.920615 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:39Z","lastTransitionTime":"2025-12-03T06:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.012425 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:40 crc kubenswrapper[4831]: E1203 06:31:40.012596 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.012668 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.012704 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.012681 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:40 crc kubenswrapper[4831]: E1203 06:31:40.012826 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:40 crc kubenswrapper[4831]: E1203 06:31:40.012957 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:40 crc kubenswrapper[4831]: E1203 06:31:40.013047 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.023875 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.023967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.023991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.024197 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.024259 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.130586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.130661 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.130684 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.130717 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.130739 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.234527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.234577 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.234591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.234610 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.234627 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.336655 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.336968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.337108 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.337143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.337166 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.439192 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.439246 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.439261 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.439284 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.439299 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.543495 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.543562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.543583 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.543612 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.543633 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.647981 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.648126 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.648149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.648175 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.648194 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.751418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.751484 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.751501 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.751526 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.751545 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.854732 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.854813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.854840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.854868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.854888 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.958533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.958594 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.958612 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.958634 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:40 crc kubenswrapper[4831]: I1203 06:31:40.958652 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:40Z","lastTransitionTime":"2025-12-03T06:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.061779 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.061858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.061878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.061906 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.061926 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.164991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.165084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.165101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.165130 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.165149 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.268359 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.268504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.268524 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.268549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.268566 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.371240 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.371290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.371302 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.371338 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.371352 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.474852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.474914 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.474937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.474969 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.474990 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.578628 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.578702 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.578738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.578769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.578789 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.675623 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:41 crc kubenswrapper[4831]: E1203 06:31:41.675804 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:41 crc kubenswrapper[4831]: E1203 06:31:41.675885 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs podName:8283839a-a189-493f-bde7-e0193d575963 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:45.675861685 +0000 UTC m=+43.019445223 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs") pod "network-metrics-daemon-lllsw" (UID: "8283839a-a189-493f-bde7-e0193d575963") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.683576 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.683633 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.683651 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.683675 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.683691 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.786863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.786926 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.786948 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.786973 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.786992 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.890188 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.890248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.890264 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.890290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.890308 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.993007 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.993057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.993069 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.993089 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:41 crc kubenswrapper[4831]: I1203 06:31:41.993101 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:41Z","lastTransitionTime":"2025-12-03T06:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.012267 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.012409 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.012474 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.012304 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:42 crc kubenswrapper[4831]: E1203 06:31:42.012529 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:42 crc kubenswrapper[4831]: E1203 06:31:42.012627 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:42 crc kubenswrapper[4831]: E1203 06:31:42.012746 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:42 crc kubenswrapper[4831]: E1203 06:31:42.013128 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.095967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.096041 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.096084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.096115 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.096171 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.199343 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.199407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.199425 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.199462 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.199481 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.302602 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.302666 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.302683 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.302710 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.302729 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.406746 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.406819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.406841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.406871 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.406891 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.510492 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.510559 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.510580 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.510604 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.510621 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.613654 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.613706 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.613724 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.613752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.613769 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.717096 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.717185 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.717208 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.717237 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.717260 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.820389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.820465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.820489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.820517 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.820538 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.924475 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.924538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.924547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.924571 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:42 crc kubenswrapper[4831]: I1203 06:31:42.924584 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:42Z","lastTransitionTime":"2025-12-03T06:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.030034 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.030091 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.030111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.030135 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.030039 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.030152 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.043305 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.060248 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.079984 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.094789 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.112921 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.132009 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.134220 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.134247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.134257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.134275 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.134286 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.146895 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.166575 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://0be9f2d5bc155bbb7e6626b37aaaeb3b7f2a95cb40ef8b68ed8013a9b0bdfc07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:33Z\\\",\\\"message\\\":\\\"all/v1/apis/informers/externalversions/factory.go:140\\\\nI1203 06:31:33.035445 6137 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI1203 06:31:33.035499 6137 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI1203 06:31:33.035553 6137 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI1203 
06:31:33.035566 6137 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:33.035588 6137 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI1203 06:31:33.035597 6137 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:33.035604 6137 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI1203 06:31:33.035618 6137 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:33.035624 6137 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:33.035629 6137 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:33.035632 6137 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:33.035645 6137 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:33.035657 6137 factory.go:656] Stopping watch factory\\\\nI1203 06:31:33.035659 6137 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06:31:33.035672 6137 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:35Z\\\",\\\"message\\\":\\\"esses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.140],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1203 06:31:35.236645 6262 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) 
and 0 (template) load balancers\\\\nI1203 06:31:35.236299 6262 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"
mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.177298 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.196135 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd
5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.212135 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.229493 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.237161 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.237190 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.237200 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.237214 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.237224 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.243161 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d09
2f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.255833 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.270028 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:43Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:43 crc 
kubenswrapper[4831]: I1203 06:31:43.340187 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.340494 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.340505 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.340518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.340527 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.443790 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.443847 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.443864 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.443887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.443905 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.547826 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.547889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.547910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.547935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.547953 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.651160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.651211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.651221 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.651240 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.651251 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.754172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.754214 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.754225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.754242 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.754252 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.856866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.856894 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.856902 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.856913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.856921 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.958814 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.958875 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.958896 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.958968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:43 crc kubenswrapper[4831]: I1203 06:31:43.958990 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:43Z","lastTransitionTime":"2025-12-03T06:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.012358 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.012378 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.012508 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:44 crc kubenswrapper[4831]: E1203 06:31:44.012534 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:44 crc kubenswrapper[4831]: E1203 06:31:44.012620 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.012771 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:44 crc kubenswrapper[4831]: E1203 06:31:44.013048 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:44 crc kubenswrapper[4831]: E1203 06:31:44.013228 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.062432 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.062492 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.062514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.062542 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.062564 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.165654 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.165715 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.165741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.165769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.165789 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.269846 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.269926 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.269947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.269974 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.269991 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.372914 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.373028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.373047 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.373070 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.373086 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.476760 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.476836 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.476858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.476893 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.476918 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.579944 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.580037 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.580060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.580170 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.580249 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.683689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.683787 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.683855 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.683881 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.683901 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.786196 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.786232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.786241 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.786257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.786267 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.889400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.889450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.889463 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.889481 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.889493 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.992688 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.992747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.992764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.992788 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:44 crc kubenswrapper[4831]: I1203 06:31:44.992805 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:44Z","lastTransitionTime":"2025-12-03T06:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.096450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.096532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.096555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.096582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.096599 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.200067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.200134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.200159 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.200203 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.200232 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.303991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.304065 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.304083 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.304112 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.304130 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.407872 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.408248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.408528 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.408691 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.408827 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.512237 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.512338 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.512363 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.512394 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.512414 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.615753 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.615820 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.615837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.615861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.615877 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.719257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.719340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.719365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.719392 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.719413 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.723373 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:45 crc kubenswrapper[4831]: E1203 06:31:45.723593 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:45 crc kubenswrapper[4831]: E1203 06:31:45.723687 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs podName:8283839a-a189-493f-bde7-e0193d575963 nodeName:}" failed. No retries permitted until 2025-12-03 06:31:53.723662378 +0000 UTC m=+51.067245926 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs") pod "network-metrics-daemon-lllsw" (UID: "8283839a-a189-493f-bde7-e0193d575963") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.822010 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.822966 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.823152 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.823387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.823585 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.926833 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.926878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.926894 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.926916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:45 crc kubenswrapper[4831]: I1203 06:31:45.926933 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:45Z","lastTransitionTime":"2025-12-03T06:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.012638 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.012779 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.012835 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.012847 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.012643 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.013119 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.013200 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.013357 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.029697 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.029760 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.029777 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.029799 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.029816 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.133253 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.133360 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.133387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.133414 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.133435 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.237239 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.237291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.237309 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.237369 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.237389 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.339306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.339362 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.339373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.339389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.339430 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.441847 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.442222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.442402 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.442544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.442707 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.546160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.546522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.546721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.546850 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.546962 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.650499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.650863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.650984 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.651342 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.651464 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.754841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.754899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.754920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.754947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.754967 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.756684 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.756728 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.756744 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.756764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.756780 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.772709 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.777257 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.777284 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.777293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.777306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.777330 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.792119 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.795698 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.795738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.795750 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.795767 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.795779 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.810208 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.813563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.813597 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.813608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.813626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.813637 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.824355 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.827520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.827554 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.827562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.827576 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.827586 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.837739 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:46Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:46Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:46 crc kubenswrapper[4831]: E1203 06:31:46.837849 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.857150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.857177 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.857186 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.857198 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.857207 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.960109 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.960160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.960168 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.960185 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:46 crc kubenswrapper[4831]: I1203 06:31:46.960194 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:46Z","lastTransitionTime":"2025-12-03T06:31:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.013202 4831 scope.go:117] "RemoveContainer" containerID="5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.034340 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.055998 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.063415 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.063456 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.063466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.063480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.063491 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.080553 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:35Z\\\",\\\"message\\\":\\\"esses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.140],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1203 
06:31:35.236645 6262 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 06:31:35.236299 6262 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.094511 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.110574 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd
5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.128132 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.143712 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.156245 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.166000 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.166066 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.166085 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.166111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.166128 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.173468 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.188777 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.203406 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"m
etrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.220076 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container 
could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.240668 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.261402 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.268843 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.268894 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.268912 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.268940 4831 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.268961 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.280870 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip
\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.294149 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"ima
geID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.340190 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/1.log" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.346080 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.346220 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.357945 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.367848 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc 
kubenswrapper[4831]: I1203 06:31:47.376514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.376570 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.376582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.376598 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.376609 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.381171 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.399213 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.414138 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.432853 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.453769 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.466918 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.478476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.478517 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.478528 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 
06:31:47.478544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.478553 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.483675 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.497605 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.512103 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.522718 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.533276 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.547152 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.562825 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.578970 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:35Z\\\",\\\"message\\\":\\\"esses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.140],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1203 
06:31:35.236645 6262 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 06:31:35.236299 6262 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] 
[]},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPat
h\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},
\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:47Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.580184 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.580222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.580232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.580248 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.580262 4831 setters.go:603] "Node became not 
ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.683274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.683356 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.683368 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.683385 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.683398 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.785826 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.785860 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.785871 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.785886 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.785899 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.888865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.888917 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.888932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.888959 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.888976 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.991340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.991374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.991388 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.991408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:47 crc kubenswrapper[4831]: I1203 06:31:47.991422 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:47Z","lastTransitionTime":"2025-12-03T06:31:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.011719 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.011787 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.011817 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.011817 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:48 crc kubenswrapper[4831]: E1203 06:31:48.011880 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:48 crc kubenswrapper[4831]: E1203 06:31:48.011957 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:48 crc kubenswrapper[4831]: E1203 06:31:48.012107 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:48 crc kubenswrapper[4831]: E1203 06:31:48.012176 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.094868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.094942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.094967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.094996 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.095018 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.198086 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.198149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.198167 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.198193 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.198211 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.301585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.301668 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.301689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.301713 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.301731 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.351825 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/2.log" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.352976 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/1.log" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.357070 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1" exitCode=1 Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.357138 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.357197 4831 scope.go:117] "RemoveContainer" containerID="5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.358307 4831 scope.go:117] "RemoveContainer" containerID="c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1" Dec 03 06:31:48 crc kubenswrapper[4831]: E1203 06:31:48.358593 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.381981 4831 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.402149 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.408491 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.408555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.408574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.408599 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.408616 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.427432 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.452097 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.474057 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.488966 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.505418 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.511455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.511508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.511527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.511551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.511569 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.522632 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servicea
ccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"read
y\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/
host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.553041 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5311c6502af8775d57354efbf7d62ebdbc4bd7c0f7e24abd159f2bb52980917a\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:35Z\\\",\\\"message\\\":\\\"esses:false,SessionAffinityConfig:nil,IPFamilyPolicy:*SingleStack,ClusterIPs:[10.217.4.140],IPFamilies:[IPv4],AllocateLoadBalancerNodePorts:nil,LoadBalancerClass:nil,InternalTrafficPolicy:*Cluster,TrafficDistribution:nil,},Status:ServiceStatus{LoadBalancer:LoadBalancerStatus{Ingress:[]LoadBalancerIngress{},},Conditions:[]Condition{},},}\\\\nI1203 
06:31:35.236645 6262 services_controller.go:454] Service openshift-console-operator/metrics for network=default has 1 cluster-wide, 0 per-node configs, 0 template configs, making 1 (cluster) 0 (per node) and 0 (template) load balancers\\\\nI1203 06:31:35.236299 6262 services_controller.go:434] Service openshift-kube-controller-manager/kube-controller-manager retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{kube-controller-manager openshift-kube-controller-manager 90927ca1-43e2-420d-8485-a35952e82cd9 4812 0 2025-02-23 05:22:57 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[prometheus:kube-controller-manager] map[operator.openshift.io/spec-hash:bb05a56151ce98d11c8554843985ba99e0498dcafd98129435c2d982c5ea4c11 service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-secret-name:serving-cert service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{Service\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 
06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\
"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-
api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.571284 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadO
nly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.599950 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd
5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.615176 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.615252 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.615293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.615349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.615367 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.635259 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66
285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.656420 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.669398 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.682688 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.695633 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:48Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:48 crc 
kubenswrapper[4831]: I1203 06:31:48.717713 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.717759 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.717771 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.717789 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.717829 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.821032 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.821428 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.821447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.821474 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.821491 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.924421 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.924459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.924470 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.924487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:48 crc kubenswrapper[4831]: I1203 06:31:48.924498 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:48Z","lastTransitionTime":"2025-12-03T06:31:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.028006 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.028097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.028118 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.028546 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.028822 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.131270 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.131361 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.131378 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.131401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.131418 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.233337 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.233379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.233390 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.233405 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.233416 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.336910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.336976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.336993 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.337017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.337035 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.370232 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/2.log" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.439824 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.439897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.439915 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.439939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.439955 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.543054 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.543119 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.543136 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.543164 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.543181 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.645869 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.645926 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.645943 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.645964 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.645980 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.748954 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.749022 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.749041 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.749070 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.749089 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.852568 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.852631 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.852647 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.852669 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.852686 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.955879 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.955977 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.955995 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.956072 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:49 crc kubenswrapper[4831]: I1203 06:31:49.956092 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:49Z","lastTransitionTime":"2025-12-03T06:31:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.012616 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:50 crc kubenswrapper[4831]: E1203 06:31:50.012809 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.013383 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.013598 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:50 crc kubenswrapper[4831]: E1203 06:31:50.013692 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:50 crc kubenswrapper[4831]: E1203 06:31:50.013528 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.013826 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:50 crc kubenswrapper[4831]: E1203 06:31:50.014014 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.062364 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.062438 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.062479 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.062514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.062582 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.166027 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.166086 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.166107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.166131 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.166148 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.269423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.269505 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.269523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.269548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.269565 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.373375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.373443 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.373470 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.373505 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.373530 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.476660 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.476719 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.476740 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.476765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.476784 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.580001 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.580074 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.580098 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.580129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.580153 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.683019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.683081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.683098 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.683124 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.683148 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.787183 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.787238 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.787254 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.787276 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.787293 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.814263 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.815809 4831 scope.go:117] "RemoveContainer" containerID="c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1" Dec 03 06:31:50 crc kubenswrapper[4831]: E1203 06:31:50.816140 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.837690 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:50Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.860121 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:50Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.878425 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:50Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.890436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.890506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.890529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 
03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.890561 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.890583 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.901464 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:50Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.927033 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:50Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.958463 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:50Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.975210 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:50Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.990952 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd
5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:50Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.993519 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.993589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.993614 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.993646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:50 crc kubenswrapper[4831]: I1203 06:31:50.993670 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:50Z","lastTransitionTime":"2025-12-03T06:31:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.006193 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66
285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:51Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.018369 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:51Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.029084 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:51Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.040400 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:51Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.050682 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:51Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:51 crc 
kubenswrapper[4831]: I1203 06:31:51.064545 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:51Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.084493 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:51Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.096676 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.096747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.096771 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.096799 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.096823 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.099871 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:51Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.227487 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.227539 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.227555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.227577 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.227593 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.330270 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.330340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.330357 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.330379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.330396 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.433567 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.433633 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.433650 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.433674 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.433692 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.536754 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.536847 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.536865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.536887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.536903 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.640143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.640194 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.640247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.640303 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.640359 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.743575 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.743637 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.743653 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.743677 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.743695 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.846941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.846980 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.846989 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.847002 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.847011 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.950056 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.950113 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.950129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.950150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.950167 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:51Z","lastTransitionTime":"2025-12-03T06:31:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:51 crc kubenswrapper[4831]: I1203 06:31:51.997014 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.010230 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.012607 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:52 crc kubenswrapper[4831]: E1203 06:31:52.012726 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.012741 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:52 crc kubenswrapper[4831]: E1203 06:31:52.012840 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.013040 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.013060 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:52 crc kubenswrapper[4831]: E1203 06:31:52.013103 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:52 crc kubenswrapper[4831]: E1203 06:31:52.013172 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.015699 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"star
ted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.032923 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.046950 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.052925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.052961 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.052972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.052990 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.053003 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.064915 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.082755 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.097072 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.111716 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.136099 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.155783 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.155878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.155891 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.155912 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.155927 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.165184 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.181962 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.204004 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd
5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.222803 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.242030 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.259910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.259971 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.259993 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.260024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.260046 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.260111 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d09
2f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.275747 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.294087 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:52Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:52 crc 
kubenswrapper[4831]: I1203 06:31:52.362537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.362592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.362610 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.362633 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.362651 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.465404 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.465496 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.465547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.465574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.465592 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.569312 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.569414 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.569436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.569464 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.569487 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.672158 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.672224 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.672247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.672278 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.672302 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.775777 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.775846 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.775867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.775897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.775923 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.879364 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.879427 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.879449 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.879475 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.879494 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.982971 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.983037 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.983055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.983082 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:52 crc kubenswrapper[4831]: I1203 06:31:52.983100 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:52Z","lastTransitionTime":"2025-12-03T06:31:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.034688 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.055475 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.075080 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.085967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.086046 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.086066 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.086089 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.086105 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.091525 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d09
2f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.112675 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.130039 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc 
kubenswrapper[4831]: I1203 06:31:53.149963 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.168808 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.186115 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.189880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.189932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.189950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.189978 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.189995 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.207603 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.227979 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.244514 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.262221 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.281531 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.293137 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.293260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.293354 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.293452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.293483 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.309614 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd
6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.346998 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.364070 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:53Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.395857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.395915 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.395931 4831 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.395956 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.395972 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.499736 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.499782 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.499807 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.499829 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.499845 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.603180 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.603246 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.603262 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.603293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.603310 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.706491 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.706569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.706591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.706621 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.706643 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.759931 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.760227 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.760391 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs podName:8283839a-a189-493f-bde7-e0193d575963 nodeName:}" failed. No retries permitted until 2025-12-03 06:32:09.760352894 +0000 UTC m=+67.103936432 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs") pod "network-metrics-daemon-lllsw" (UID: "8283839a-a189-493f-bde7-e0193d575963") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.809927 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.809983 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.809997 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.810016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.810029 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.913603 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.913673 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.913694 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.913721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.913742 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:53Z","lastTransitionTime":"2025-12-03T06:31:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.961506 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.961685 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.961747 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.961818 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:32:25.96178083 +0000 UTC m=+83.305364388 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.961860 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.961877 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.961917 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:32:25.961900884 +0000 UTC m=+83.305484432 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: I1203 06:31:53.961968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.962142 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.962213 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:32:25.962197672 +0000 UTC m=+83.305781220 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.962480 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.962630 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.962666 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.962834 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:32:25.962799541 +0000 UTC m=+83.306383109 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.963481 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.963518 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.963538 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:53 crc kubenswrapper[4831]: E1203 06:31:53.963636 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:32:25.963609067 +0000 UTC m=+83.307192675 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.012277 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.012423 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.012423 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:54 crc kubenswrapper[4831]: E1203 06:31:54.012600 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.012882 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:54 crc kubenswrapper[4831]: E1203 06:31:54.013048 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:54 crc kubenswrapper[4831]: E1203 06:31:54.013201 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:54 crc kubenswrapper[4831]: E1203 06:31:54.013372 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.016883 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.016957 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.016974 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.016999 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.017018 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.119718 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.119776 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.119795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.119818 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.119835 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.223716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.223751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.223761 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.223774 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.223783 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.327450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.327513 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.327529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.327555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.327576 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.430256 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.430385 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.430411 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.430438 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.430457 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.533462 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.533512 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.533530 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.533553 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.533570 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.636887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.636937 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.636955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.636977 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.636993 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.740773 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.740820 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.740836 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.740858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.740874 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.843665 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.843714 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.843726 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.843741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.843754 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.946367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.946423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.946438 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.946461 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:54 crc kubenswrapper[4831]: I1203 06:31:54.946481 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:54Z","lastTransitionTime":"2025-12-03T06:31:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.049134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.049194 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.049211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.049235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.049252 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.152304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.152399 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.152418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.152443 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.152465 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.254923 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.254994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.255016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.255050 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.255072 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.357832 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.357877 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.357899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.357925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.357951 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.461105 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.461160 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.461179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.461204 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.461221 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.564829 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.564892 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.564915 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.564942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.564964 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.667911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.667968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.667991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.668018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.668036 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.770679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.770733 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.770751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.770772 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.770788 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.874985 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.875051 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.875082 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.875111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.875129 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.977774 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.977853 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.977874 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.977901 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:55 crc kubenswrapper[4831]: I1203 06:31:55.977923 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:55Z","lastTransitionTime":"2025-12-03T06:31:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.012587 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.012672 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.012698 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.012711 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:56 crc kubenswrapper[4831]: E1203 06:31:56.013294 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:56 crc kubenswrapper[4831]: E1203 06:31:56.013409 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:56 crc kubenswrapper[4831]: E1203 06:31:56.013547 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:56 crc kubenswrapper[4831]: E1203 06:31:56.013787 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.080407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.080459 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.080476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.080501 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.080518 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.184550 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.184671 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.184691 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.184717 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.184735 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.287615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.287689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.287715 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.287746 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.287769 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.391246 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.391304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.391357 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.391384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.391402 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.494813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.494924 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.494947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.494978 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.495035 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.598469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.598555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.598579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.598612 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.598642 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.701521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.701586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.701605 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.701631 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.701649 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.804472 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.804542 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.804561 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.804589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.804607 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.906731 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.906784 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.906801 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.906823 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.906838 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.955635 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.955701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.955719 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.955743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.955760 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: E1203 06:31:56.974907 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:56Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.979708 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.979783 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.979803 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.979829 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.979847 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:56 crc kubenswrapper[4831]: E1203 06:31:56.995080 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:56Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.999727 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.999785 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.999807 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.999830 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:56 crc kubenswrapper[4831]: I1203 06:31:56.999848 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:56Z","lastTransitionTime":"2025-12-03T06:31:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: E1203 06:31:57.015232 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:57Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.019237 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.019299 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.019353 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.019385 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.019409 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: E1203 06:31:57.038662 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:57Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.043684 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.043741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.043756 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.043780 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.043796 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: E1203 06:31:57.059975 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:31:57Z is after 2025-08-24T17:21:41Z" Dec 03 06:31:57 crc kubenswrapper[4831]: E1203 06:31:57.060152 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.061913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.061945 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.061956 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.061976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.061992 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.171114 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.171172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.171191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.171215 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.171232 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.274660 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.274737 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.274755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.274781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.274801 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.377526 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.377585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.377602 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.377630 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.377648 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.480659 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.480737 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.480761 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.480791 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.480812 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.584488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.584549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.584572 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.584606 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.584627 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.687808 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.687899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.687925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.687957 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.687984 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.791002 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.791075 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.791092 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.791117 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.791134 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.894649 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.894709 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.894733 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.894761 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.894779 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.997684 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.997749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.997768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.997793 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:57 crc kubenswrapper[4831]: I1203 06:31:57.997812 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:57Z","lastTransitionTime":"2025-12-03T06:31:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.012194 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.012248 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.012196 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:31:58 crc kubenswrapper[4831]: E1203 06:31:58.012391 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.012435 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:31:58 crc kubenswrapper[4831]: E1203 06:31:58.012511 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:31:58 crc kubenswrapper[4831]: E1203 06:31:58.012663 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:31:58 crc kubenswrapper[4831]: E1203 06:31:58.012773 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.101916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.101994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.102017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.102048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.102070 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.205310 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.205499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.205530 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.205559 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.205581 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.309088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.309148 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.309167 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.309191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.309207 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.412200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.412256 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.412272 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.412296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.412343 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.515785 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.515843 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.515859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.515890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.515912 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.619865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.619942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.619967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.619996 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.620018 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.723524 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.723583 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.723633 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.723657 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.723675 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.826911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.826981 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.827001 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.827025 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.827044 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.930235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.930301 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.930351 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.930377 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:58 crc kubenswrapper[4831]: I1203 06:31:58.930395 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:58Z","lastTransitionTime":"2025-12-03T06:31:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.033175 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.033236 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.033255 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.033280 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.033297 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.136349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.136415 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.136432 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.136457 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.136478 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.239499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.239555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.239573 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.239596 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.239612 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.342372 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.342440 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.342462 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.342489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.342507 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.445096 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.445151 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.445169 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.445191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.445208 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.548443 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.548529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.548555 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.548581 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.548598 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.652089 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.652157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.652175 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.652197 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.652213 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.756211 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.756266 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.756283 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.756305 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.756351 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.859382 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.859445 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.859463 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.859488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.859505 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.961955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.962057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.962140 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.962216 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:31:59 crc kubenswrapper[4831]: I1203 06:31:59.962249 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:31:59Z","lastTransitionTime":"2025-12-03T06:31:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.012088 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.012186 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.012290 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:00 crc kubenswrapper[4831]: E1203 06:32:00.012283 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:00 crc kubenswrapper[4831]: E1203 06:32:00.012536 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.012587 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:00 crc kubenswrapper[4831]: E1203 06:32:00.012698 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:00 crc kubenswrapper[4831]: E1203 06:32:00.012809 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.065429 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.065489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.065515 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.065543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.065568 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.168046 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.168095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.168115 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.168145 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.168168 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.271135 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.271193 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.271209 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.271235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.271251 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.374756 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.374817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.374834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.374859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.374878 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.477768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.477824 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.477841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.477868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.477885 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.580650 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.580706 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.580722 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.580743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.580760 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.683448 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.683492 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.683509 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.683535 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.683554 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.786839 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.786890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.786906 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.786933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.786950 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.890240 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.890301 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.890373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.890400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.890416 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.994008 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.994072 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.994094 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.994122 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:00 crc kubenswrapper[4831]: I1203 06:32:00.994143 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:00Z","lastTransitionTime":"2025-12-03T06:32:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.097441 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.097516 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.097541 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.097576 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.097601 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.201262 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.201463 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.201490 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.201524 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.201551 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.304717 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.304766 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.304782 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.304806 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.304823 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.408644 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.408703 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.408725 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.408749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.408766 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.511267 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.511369 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.511397 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.511429 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.511452 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.614837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.614898 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.614915 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.614941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.614963 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.718215 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.718274 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.718296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.718356 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.718381 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.821295 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.821408 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.821430 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.821456 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.821473 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.924131 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.924199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.924222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.924253 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:01 crc kubenswrapper[4831]: I1203 06:32:01.924276 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:01Z","lastTransitionTime":"2025-12-03T06:32:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.012775 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.012826 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:02 crc kubenswrapper[4831]: E1203 06:32:02.012886 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.012778 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:02 crc kubenswrapper[4831]: E1203 06:32:02.012966 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.013061 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:02 crc kubenswrapper[4831]: E1203 06:32:02.017000 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:02 crc kubenswrapper[4831]: E1203 06:32:02.017125 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.027076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.027124 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.027133 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.027147 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.027155 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.130602 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.130659 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.130675 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.130698 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.130718 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.234411 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.234477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.234494 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.234516 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.234536 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.337407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.337466 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.337482 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.337504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.337522 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.440198 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.440276 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.440293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.440355 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.440374 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.543562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.543646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.543667 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.543691 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.543736 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.650028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.650115 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.650165 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.650192 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.650210 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.753212 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.753266 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.753283 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.753306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.753349 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.857669 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.857731 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.857751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.857778 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.857796 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.961444 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.961504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.961520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.961543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:02 crc kubenswrapper[4831]: I1203 06:32:02.961565 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:02Z","lastTransitionTime":"2025-12-03T06:32:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.035708 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.056954 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.064873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.064956 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.064979 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.065012 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.065046 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.076346 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.099518 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.121017 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.138880 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02
a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.157038 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.168206 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.168267 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.168291 4831 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.168362 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.168389 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.173982 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\"
:{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.190174 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.210889 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.239191 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.255424 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.271254 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc 
kubenswrapper[4831]: I1203 06:32:03.271831 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.271889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.271905 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.271930 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.271948 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.289867 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.309462 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.328191 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.343456 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:03Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.374559 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.374592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.374604 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.374619 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.374630 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.477491 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.477560 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.477583 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.477614 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.477637 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.580344 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.580391 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.580400 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.580430 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.580439 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.683010 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.683251 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.683266 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.683281 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.683294 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.785632 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.785716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.785735 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.785760 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.785780 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.888874 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.888919 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.888933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.888948 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.888960 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.991509 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.991591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.991624 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.991654 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:03 crc kubenswrapper[4831]: I1203 06:32:03.991677 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:03Z","lastTransitionTime":"2025-12-03T06:32:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.012373 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.012420 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.012456 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.012420 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:04 crc kubenswrapper[4831]: E1203 06:32:04.012513 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:04 crc kubenswrapper[4831]: E1203 06:32:04.012653 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:04 crc kubenswrapper[4831]: E1203 06:32:04.012796 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:04 crc kubenswrapper[4831]: E1203 06:32:04.012914 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.094791 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.099563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.099583 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.099617 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.099635 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.202710 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.202752 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.202766 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.202786 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.202800 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.306598 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.306652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.306668 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.306691 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.306706 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.409537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.409594 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.409615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.409644 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.409669 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.512858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.512939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.512964 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.512994 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.513014 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.615491 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.615569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.615586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.615610 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.615627 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.774052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.774094 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.774103 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.774117 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.774126 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.876252 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.876292 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.876300 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.876329 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.876341 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.979068 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.979143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.979172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.979266 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:04 crc kubenswrapper[4831]: I1203 06:32:04.979297 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:04Z","lastTransitionTime":"2025-12-03T06:32:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.082289 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.082385 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.082409 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.082435 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.082455 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.185847 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.185895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.185911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.185933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.185949 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.288623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.288665 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.288682 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.288703 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.288719 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.391023 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.391073 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.391089 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.391111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.391127 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.494447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.494501 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.494519 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.494544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.494561 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.597719 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.597762 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.597777 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.597799 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.597816 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.700215 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.700251 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.700263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.700278 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.700290 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.803972 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.804037 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.804054 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.804079 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.804096 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.907476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.907543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.907593 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.907621 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:05 crc kubenswrapper[4831]: I1203 06:32:05.907639 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:05Z","lastTransitionTime":"2025-12-03T06:32:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.010594 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.010659 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.010681 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.010710 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.010731 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.011716 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.011769 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.011797 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.011797 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:06 crc kubenswrapper[4831]: E1203 06:32:06.011907 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:06 crc kubenswrapper[4831]: E1203 06:32:06.012065 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:06 crc kubenswrapper[4831]: E1203 06:32:06.012692 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:06 crc kubenswrapper[4831]: E1203 06:32:06.012924 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.013137 4831 scope.go:117] "RemoveContainer" containerID="c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1" Dec 03 06:32:06 crc kubenswrapper[4831]: E1203 06:32:06.013538 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.114402 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.114449 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.114465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.114488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.114504 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.218188 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.218282 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.218425 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.218454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.218471 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.321238 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.321381 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.321417 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.321449 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.321477 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.424864 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.424913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.424925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.424944 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.424956 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.527865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.528079 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.528091 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.528107 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.528120 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.630384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.630442 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.630456 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.630477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.630493 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.732681 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.732719 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.732728 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.732743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.732752 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.835048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.835121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.835302 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.835375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.835399 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.937926 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.937984 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.938007 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.938035 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:06 crc kubenswrapper[4831]: I1203 06:32:06.938056 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:06Z","lastTransitionTime":"2025-12-03T06:32:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.040342 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.040389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.040407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.040431 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.040448 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.143515 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.143585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.143602 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.143630 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.143649 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.183368 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.183418 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.183430 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.183447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.183456 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: E1203 06:32:07.198364 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.202003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.202061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.202078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.202101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.202117 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: E1203 06:32:07.222796 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.227864 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.227890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.227899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.227912 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.227925 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: E1203 06:32:07.248075 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.252657 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.252776 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.252844 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.252939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.253015 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: E1203 06:32:07.271222 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.275207 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.275271 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.275291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.275344 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.275363 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: E1203 06:32:07.291104 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:07Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:07Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:07 crc kubenswrapper[4831]: E1203 06:32:07.291225 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.292732 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.292799 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.292824 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.292854 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.292907 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.396271 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.396309 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.396334 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.396348 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.396357 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.499143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.499461 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.499543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.499639 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.499722 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.601888 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.601921 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.601932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.601947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.601957 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.705188 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.705244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.705259 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.705282 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.705299 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.812534 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.812590 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.812608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.812630 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.812647 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.915379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.915436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.915453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.915478 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:07 crc kubenswrapper[4831]: I1203 06:32:07.915495 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:07Z","lastTransitionTime":"2025-12-03T06:32:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.012092 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.012132 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.012191 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:08 crc kubenswrapper[4831]: E1203 06:32:08.012218 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:08 crc kubenswrapper[4831]: E1203 06:32:08.012341 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:08 crc kubenswrapper[4831]: E1203 06:32:08.012474 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.012877 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:08 crc kubenswrapper[4831]: E1203 06:32:08.013202 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.018557 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.018608 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.018625 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.018653 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.018672 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.121405 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.121464 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.121481 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.121504 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.121521 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.223649 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.223704 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.223722 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.223744 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.223762 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.326389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.326473 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.326495 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.326523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.326540 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.429701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.430095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.430239 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.430409 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.430548 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.533265 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.533646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.533784 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.533916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.534036 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.636815 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.636867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.636879 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.636899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.636913 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.739631 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.739689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.739706 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.739728 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.739745 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.842044 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.842081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.842088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.842102 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.842113 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.943692 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.943738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.943750 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.943766 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:08 crc kubenswrapper[4831]: I1203 06:32:08.943776 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:08Z","lastTransitionTime":"2025-12-03T06:32:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.045522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.045557 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.045566 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.045578 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.045589 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.147793 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.147857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.147875 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.147898 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.147914 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.250867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.250935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.250952 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.250976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.250992 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.353824 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.353882 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.353901 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.353924 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.353942 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.455931 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.455997 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.456019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.456050 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.456072 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.558821 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.558891 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.558922 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.558953 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.558976 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.661900 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.661963 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.661991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.662023 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.662060 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.764446 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.764529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.764551 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.764584 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.764602 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.839189 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:09 crc kubenswrapper[4831]: E1203 06:32:09.839339 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:32:09 crc kubenswrapper[4831]: E1203 06:32:09.839388 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs podName:8283839a-a189-493f-bde7-e0193d575963 nodeName:}" failed. No retries permitted until 2025-12-03 06:32:41.839372599 +0000 UTC m=+99.182956107 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs") pod "network-metrics-daemon-lllsw" (UID: "8283839a-a189-493f-bde7-e0193d575963") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.867004 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.867064 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.867081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.867104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.867123 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.969476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.969550 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.969636 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.969679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:09 crc kubenswrapper[4831]: I1203 06:32:09.969708 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:09Z","lastTransitionTime":"2025-12-03T06:32:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.012632 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.012673 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.012730 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.012733 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:10 crc kubenswrapper[4831]: E1203 06:32:10.012848 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:10 crc kubenswrapper[4831]: E1203 06:32:10.012943 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:10 crc kubenswrapper[4831]: E1203 06:32:10.013050 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:10 crc kubenswrapper[4831]: E1203 06:32:10.013185 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.072269 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.072360 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.072377 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.072402 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.072420 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.175095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.175149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.175169 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.175526 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.175667 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.279755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.279811 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.279827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.279852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.279870 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.382932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.382986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.383001 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.383024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.383038 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.485531 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.485574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.485584 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.485598 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.485609 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.587804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.587840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.587849 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.587865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.587873 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.691095 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.691139 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.691157 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.691180 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.691197 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.793611 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.793645 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.793657 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.793676 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.793686 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.895696 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.895761 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.895779 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.895804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.895821 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.999537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.999638 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.999658 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.999684 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:10 crc kubenswrapper[4831]: I1203 06:32:10.999702 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:10Z","lastTransitionTime":"2025-12-03T06:32:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.103229 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.103288 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.103304 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.103375 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.103399 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.206828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.206890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.206908 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.206933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.206950 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.309380 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.309419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.309427 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.309439 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.309448 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.411774 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.411827 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.411836 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.411850 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.411858 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.452287 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/0.log" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.452349 4831 generic.go:334] "Generic (PLEG): container finished" podID="74a16df4-1f25-4b0f-9e08-f6486f262a68" containerID="7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5" exitCode=1 Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.452374 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vz8ft" event={"ID":"74a16df4-1f25-4b0f-9e08-f6486f262a68","Type":"ContainerDied","Data":"7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.452686 4831 scope.go:117] "RemoveContainer" containerID="7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.470958 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.491784 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:11Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:10Z\\\",\\\"message\\\":\\\"2025-12-03T06:31:25+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de\\\\n2025-12-03T06:31:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de to /host/opt/cni/bin/\\\\n2025-12-03T06:31:25Z [verbose] multus-daemon started\\\\n2025-12-03T06:31:25Z [verbose] Readiness Indicator file check\\\\n2025-12-03T06:32:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.509250 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.514041 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.514072 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.514081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc 
kubenswrapper[4831]: I1203 06:32:11.514096 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.514105 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.530523 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03
T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.548586 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,
\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs
\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.566959 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.581236 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.606492 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.616015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.616052 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.616061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.616075 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.616084 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.617428 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.627228 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc 
kubenswrapper[4831]: I1203 06:32:11.643176 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2
f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.656022 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.669621 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.679982 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.702881 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.717497 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.718850 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.718900 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.718917 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 
06:32:11.718940 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.718958 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.735450 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\
\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:11Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.821939 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.821987 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.822003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.822025 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.822044 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.924453 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.924495 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.924506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.924521 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:11 crc kubenswrapper[4831]: I1203 06:32:11.924533 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:11Z","lastTransitionTime":"2025-12-03T06:32:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.011833 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.011869 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.011891 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.011931 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:12 crc kubenswrapper[4831]: E1203 06:32:12.012038 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:12 crc kubenswrapper[4831]: E1203 06:32:12.012188 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:12 crc kubenswrapper[4831]: E1203 06:32:12.012284 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:12 crc kubenswrapper[4831]: E1203 06:32:12.012374 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.026412 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.026447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.026455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.026469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.026479 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.128762 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.128813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.128833 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.128861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.128886 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.231591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.231646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.231662 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.231686 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.231703 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.333925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.333975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.333989 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.334006 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.334019 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.435887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.435915 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.435924 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.435935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.435943 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.457179 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/0.log" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.457222 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vz8ft" event={"ID":"74a16df4-1f25-4b0f-9e08-f6486f262a68","Type":"ContainerStarted","Data":"2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.473159 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243
b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\
\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.489342 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.506830 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.527854 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.537876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.537918 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.537929 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.537946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.537959 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.541072 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.559943 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' 
detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3582577
1aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.574447 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.591181 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.603648 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.614805 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.630388 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc 
kubenswrapper[4831]: I1203 06:32:12.640822 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.640884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.640906 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.640934 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.640970 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.648301 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.668183 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.687126 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.706922 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.721269 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:10Z\\\",\\\"message\\\":\\\"2025-12-03T06:31:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de\\\\n2025-12-03T06:31:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de to /host/opt/cni/bin/\\\\n2025-12-03T06:31:25Z [verbose] multus-daemon started\\\\n2025-12-03T06:31:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:32:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.736362 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:12Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.743138 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.743183 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.743202 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc 
kubenswrapper[4831]: I1203 06:32:12.743225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.743242 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.845751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.845805 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.845822 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.845845 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.845862 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.948454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.948502 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.948514 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.948529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:12 crc kubenswrapper[4831]: I1203 06:32:12.948539 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:12Z","lastTransitionTime":"2025-12-03T06:32:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.026347 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.039508 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:10Z\\\",\\\"message\\\":\\\"2025-12-03T06:31:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de\\\\n2025-12-03T06:31:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de to /host/opt/cni/bin/\\\\n2025-12-03T06:31:25Z [verbose] multus-daemon started\\\\n2025-12-03T06:31:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:32:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.051002 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.051035 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.051048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.051067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.051079 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.053437 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.066136 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.080345 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.101395 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.114777 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.127301 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.139462 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.155299 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.155629 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.155661 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.155673 4831 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.155688 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.155699 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.167049 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d09
2f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.177645 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.188752 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc 
kubenswrapper[4831]: I1203 06:32:13.200311 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2
f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.214762 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.227283 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.238454 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:32:13Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.258455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.258486 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.258494 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.258509 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.258519 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.362061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.362123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.362145 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.362175 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.362198 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.464585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.464650 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.464670 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.464700 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.464718 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.568168 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.568401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.568465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.568547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.568636 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.671508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.671790 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.671850 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.671928 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.671996 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.774259 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.774373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.774402 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.774434 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.774456 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.877234 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.877271 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.877287 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.877308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.877369 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.980191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.980253 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.980269 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.980293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:13 crc kubenswrapper[4831]: I1203 06:32:13.980309 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:13Z","lastTransitionTime":"2025-12-03T06:32:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.012092 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.012113 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.012205 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:14 crc kubenswrapper[4831]: E1203 06:32:14.012688 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:14 crc kubenswrapper[4831]: E1203 06:32:14.012810 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.012213 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:14 crc kubenswrapper[4831]: E1203 06:32:14.012994 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:14 crc kubenswrapper[4831]: E1203 06:32:14.013203 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.082816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.082866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.082884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.082907 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.082922 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.186718 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.186764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.186776 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.186796 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.186807 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.288712 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.288763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.288771 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.288786 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.288798 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.391522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.391561 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.391571 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.391586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.391594 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.494887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.494938 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.494953 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.494984 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.494998 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.597190 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.597243 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.597259 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.597340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.597390 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.700162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.700200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.700209 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.700222 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.700230 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.802932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.802999 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.803015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.803488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.803544 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.906412 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.906472 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.906488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.906512 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:14 crc kubenswrapper[4831]: I1203 06:32:14.906528 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:14Z","lastTransitionTime":"2025-12-03T06:32:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.009601 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.009662 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.009679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.009706 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.009728 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.112461 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.112523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.112541 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.112565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.112584 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.215631 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.215694 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.215714 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.215739 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.215759 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.318840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.318960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.318982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.319009 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.319026 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.421743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.421800 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.421817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.421906 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.421978 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.524071 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.524133 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.524154 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.524180 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.524200 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.627671 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.627736 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.627754 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.627777 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.627794 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.730963 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.731017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.731035 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.731057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.731073 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.834005 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.834049 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.834057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.834070 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.834080 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.936862 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.936919 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.936935 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.936960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:15 crc kubenswrapper[4831]: I1203 06:32:15.936979 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:15Z","lastTransitionTime":"2025-12-03T06:32:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.012411 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.012462 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.012519 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.012419 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:16 crc kubenswrapper[4831]: E1203 06:32:16.012587 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:16 crc kubenswrapper[4831]: E1203 06:32:16.012720 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:16 crc kubenswrapper[4831]: E1203 06:32:16.012862 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:16 crc kubenswrapper[4831]: E1203 06:32:16.012790 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.039735 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.039819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.039841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.039868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.039885 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.142762 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.142830 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.142852 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.142880 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.142903 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.246048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.246081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.246089 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.246103 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.246111 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.348571 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.348661 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.348686 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.348711 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.348729 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.452883 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.452946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.452966 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.452991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.453008 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.556251 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.556342 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.556359 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.556389 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.556406 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.659422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.659488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.659510 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.659538 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.659636 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.762580 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.762623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.762634 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.762649 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.762662 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.865024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.865076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.865088 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.865111 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.865123 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.967895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.967933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.967946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.967962 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:16 crc kubenswrapper[4831]: I1203 06:32:16.967974 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:16Z","lastTransitionTime":"2025-12-03T06:32:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.070955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.071042 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.071064 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.071099 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.071123 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.174856 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.174924 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.174942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.174968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.174987 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.278377 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.278433 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.278450 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.278473 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.278489 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.362945 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.363068 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.363150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.363238 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.363270 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: E1203 06:32:17.385202 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:17Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.390404 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.390454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.390465 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.390483 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.390494 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: E1203 06:32:17.410797 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:17Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.416006 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.416104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.416170 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.416194 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.416210 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: E1203 06:32:17.437403 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:17Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.442811 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.442866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.442884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.442910 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.442927 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: E1203 06:32:17.463155 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:17Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.468472 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.468545 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.468564 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.468589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.468606 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: E1203 06:32:17.488948 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:17Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:17Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:17 crc kubenswrapper[4831]: E1203 06:32:17.489509 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.493073 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.493131 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.493149 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.493174 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.493192 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.596284 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.596394 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.596407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.596456 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.596469 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.701841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.701933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.701952 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.701975 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.702022 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.805498 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.805579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.805601 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.805633 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.805652 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.908813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.909144 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.909296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.909527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:17 crc kubenswrapper[4831]: I1203 06:32:17.909661 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:17Z","lastTransitionTime":"2025-12-03T06:32:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.011731 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.011750 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.011810 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.011846 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:18 crc kubenswrapper[4831]: E1203 06:32:18.011980 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:18 crc kubenswrapper[4831]: E1203 06:32:18.012077 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:18 crc kubenswrapper[4831]: E1203 06:32:18.012578 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.012769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.012812 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.012829 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: E1203 06:32:18.012803 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.012850 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.012898 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.013208 4831 scope.go:117] "RemoveContainer" containerID="c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.116258 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.116431 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.116455 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.116484 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.116505 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.220076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.220148 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.220172 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.220202 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.220226 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.329092 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.329163 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.329200 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.329225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.329247 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.431868 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.431916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.431933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.431956 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.431971 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.480876 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/2.log" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.484842 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.485219 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.500351 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.524992 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.536444 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.536489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.536508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.536531 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.536558 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.556820 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 
06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.576587 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.597983 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.621257 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.640053 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.640101 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.640117 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.640137 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.640151 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.651534 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.666498 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.680825 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.696275 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc 
kubenswrapper[4831]: I1203 06:32:18.719230 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2
f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.738411 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.742349 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.742372 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.742380 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.742393 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.742402 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.753772 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.779571 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.794631 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.807940 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:10Z\\\",\\\"message\\\":\\\"2025-12-03T06:31:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de\\\\n2025-12-03T06:31:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de to /host/opt/cni/bin/\\\\n2025-12-03T06:31:25Z [verbose] multus-daemon started\\\\n2025-12-03T06:31:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:32:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.819569 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:18Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.845434 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.845471 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.845481 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc 
kubenswrapper[4831]: I1203 06:32:18.845495 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.845503 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.948156 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.948218 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.948236 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.948260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:18 crc kubenswrapper[4831]: I1203 06:32:18.948278 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:18Z","lastTransitionTime":"2025-12-03T06:32:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.051448 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.051574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.051591 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.051615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.051631 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.154536 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.154587 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.154603 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.154652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.154669 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.257309 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.257409 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.257428 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.257454 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.257473 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.360863 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.360936 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.360958 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.360988 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.361008 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.463758 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.463841 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.463869 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.463899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.463918 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.491173 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/3.log" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.491987 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/2.log" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.496736 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" exitCode=1 Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.496807 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.496880 4831 scope.go:117] "RemoveContainer" containerID="c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.497814 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:32:19 crc kubenswrapper[4831]: E1203 06:32:19.498171 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.525983 4831 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.545050 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.565832 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.569308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.569407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.569426 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc 
kubenswrapper[4831]: I1203 06:32:19.569500 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.569517 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.589979 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e
eaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd
6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.628670 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c789947f0ea317fa4de4eab214d8f4694a130f829b0345895f20285c59cfcaf1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:31:48Z\\\",\\\"message\\\":\\\"moval\\\\nI1203 06:31:47.938671 6468 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI1203 06:31:47.938679 6468 handler.go:208] Removed *v1.Node event handler 2\\\\nI1203 06:31:47.938707 6468 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI1203 06:31:47.938714 6468 handler.go:208] Removed *v1.Node event handler 
7\\\\nI1203 06:31:47.938719 6468 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI1203 06:31:47.938719 6468 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI1203 06:31:47.938719 6468 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI1203 06:31:47.938738 6468 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI1203 06:31:47.938754 6468 handler.go:208] Removed *v1.Pod event handler 6\\\\nI1203 06:31:47.938766 6468 handler.go:208] Removed *v1.Pod event handler 3\\\\nI1203 06:31:47.939102 6468 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI1203 06:31:47.939122 6468 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI1203 06:31:47.939157 6468 factory.go:656] Stopping watch factory\\\\nI1203 06:31:47.939173 6468 ovnkube.go:599] Stopped ovnkube\\\\nI1203 06:31:47.939200 6468 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI1203 06\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:19Z\\\",\\\"message\\\":\\\"F1203 06:32:19.104042 6817 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error 
occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 06:32:19.104067 6817 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:32:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bi
n\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"m
ountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.647892 4831 
status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.664900 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc 
kubenswrapper[4831]: I1203 06:32:19.671592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.671640 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.671657 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.671679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.671696 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.687876 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.708616 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.727052 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.744665 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.765005 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.774682 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.774747 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.774769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.774796 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.774818 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.785377 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.804722 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.824231 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8
b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.840954 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:10Z\\\",\\\"message\\\":\\\"2025-12-03T06:31:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de\\\\n2025-12-03T06:31:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de to /host/opt/cni/bin/\\\\n2025-12-03T06:31:25Z [verbose] multus-daemon started\\\\n2025-12-03T06:31:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:32:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.856156 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.877291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.877353 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.877365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc 
kubenswrapper[4831]: I1203 06:32:19.877384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.877396 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.980789 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.980866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.980884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.980909 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:19 crc kubenswrapper[4831]: I1203 06:32:19.980927 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:19Z","lastTransitionTime":"2025-12-03T06:32:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.012659 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:20 crc kubenswrapper[4831]: E1203 06:32:20.012799 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.013021 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:20 crc kubenswrapper[4831]: E1203 06:32:20.013097 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.013253 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:20 crc kubenswrapper[4831]: E1203 06:32:20.013341 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.013476 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:20 crc kubenswrapper[4831]: E1203 06:32:20.013548 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.083879 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.083938 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.083955 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.083982 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.084000 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.187298 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.187369 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.187387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.187413 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.187432 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.290547 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.290623 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.290646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.290676 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.290698 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.393626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.393925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.394115 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.394373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.394602 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.498433 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.498489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.498508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.498530 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.498548 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.504158 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/3.log" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.508835 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:32:20 crc kubenswrapper[4831]: E1203 06:32:20.509173 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.529644 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.550030 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:10Z\\\",\\\"message\\\":\\\"2025-12-03T06:31:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de\\\\n2025-12-03T06:31:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de to /host/opt/cni/bin/\\\\n2025-12-03T06:31:25Z [verbose] multus-daemon started\\\\n2025-12-03T06:31:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:32:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.564551 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.581513 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.600697 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.602502 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.602564 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.602586 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.602615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.602639 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.632161 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:19Z\\\",\\\"message\\\":\\\"F1203 06:32:19.104042 6817 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 06:32:19.104067 6817 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:32:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.651818 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.671474 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.693312 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.704889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.704965 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.704990 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.705020 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.705046 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.715434 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.733381 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.748490 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.764453 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc 
kubenswrapper[4831]: I1203 06:32:20.784480 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2
f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"te
rminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.802509 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.807489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.807528 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.807543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.807562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.807579 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.818213 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.839905 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2025-12-03T06:32:20Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.909896 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.909973 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.909986 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.910008 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:20 crc kubenswrapper[4831]: I1203 06:32:20.910023 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:20Z","lastTransitionTime":"2025-12-03T06:32:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.014108 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.014173 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.014190 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.014212 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.014230 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.034442 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.116711 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.116776 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.116794 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.116820 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.116842 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.219648 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.219707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.219724 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.219749 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.219766 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.322131 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.322191 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.322208 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.322232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.322248 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.425558 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.425624 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.425642 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.425668 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.425684 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.528727 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.528774 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.528790 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.528813 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.528830 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.632171 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.632246 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.632264 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.632291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.632339 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.735002 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.735061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.735081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.735104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.735123 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.837755 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.837819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.837837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.837862 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.837880 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.941150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.941235 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.941261 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.941293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:21 crc kubenswrapper[4831]: I1203 06:32:21.941323 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:21Z","lastTransitionTime":"2025-12-03T06:32:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.011736 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.011755 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.011813 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.011871 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:22 crc kubenswrapper[4831]: E1203 06:32:22.012072 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:22 crc kubenswrapper[4831]: E1203 06:32:22.012204 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:22 crc kubenswrapper[4831]: E1203 06:32:22.012404 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:22 crc kubenswrapper[4831]: E1203 06:32:22.012530 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.044756 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.044816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.044832 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.044857 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.044874 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.148050 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.148092 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.148102 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.148118 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.148129 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.250587 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.250651 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.250675 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.250702 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.250722 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.353374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.353419 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.353441 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.353462 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.353479 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.455225 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.455276 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.455292 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.455353 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.455372 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.558507 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.558596 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.558622 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.558652 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.558675 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.662452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.662511 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.662544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.662568 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.662589 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.765247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.765318 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.765373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.765407 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.765431 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.870129 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.870215 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.870237 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.870262 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.870281 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.973829 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.974271 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.974488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.974700 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:22 crc kubenswrapper[4831]: I1203 06:32:22.974963 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:22Z","lastTransitionTime":"2025-12-03T06:32:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.038250 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a0f25af83afa0f7048bd7260d24328a8ea840b5a45301b6265ea765da3ce26e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.059958 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.078655 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.078724 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.078741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.078764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.078796 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.081305 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ac835bcd34334774c32df5003ef67729db26dcab284a78432bea45ad4e212205\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.097975 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6214a0d0-5231-4b11-a9d9-8f3bf85dc6b0\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca2e6c838b6a2917c53bec08f8abd4dc5e5cf0e279e7352a225ddd7703853343\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,
\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://50223f94ad8bd7d3e4a1fd96c6525e2344cce1c7d8172dfb849ff4177d29d35c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://50223f94ad8bd7d3e4a1fd96c6525e2344cce1c7d8172dfb849ff4177d29d35c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.115884 4831 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e2f17409ffbadecdf7ee5bf76f994ba215fe82bd1c6befe0ecd99d89f66520e5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://79d8c0cad77c9a8b07ed6e88c59adf081ec89718c8561aa3ecd53b419b7cf311\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.137548 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-vz8ft" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"74a16df4-1f25-4b0f-9e08-f6486f262a68\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:12Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:10Z\\\",\\\"message\\\":\\\"2025-12-03T06:31:25+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de\\\\n2025-12-03T06:31:25+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_2fef4864-8b78-4a57-80a8-70e2ab8a18de to /host/opt/cni/bin/\\\\n2025-12-03T06:31:25Z [verbose] multus-daemon started\\\\n2025-12-03T06:31:25Z [verbose] 
Readiness Indicator file check\\\\n2025-12-03T06:32:10Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:32:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qbzrh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-vz8ft\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.153683 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"4e04caf2-8e18-4af8-9779-c5711262077b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3914b7dc4ef824d42c5c34473609e35c9ddb228293025b852e5e423d0823e646\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f
3043f0f2f6e5af657458891c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zkqgc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-dvcq5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.171147 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"069455ec-45ad-4fb1-b102-c0624f3c7f15\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2d320c688af94e83163a489c45472fa0c4aa5a3d1f4aeade63c37d536888361d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://978aff94d5ba5b47d721fd41ecdb14563b76a93959caae023dcd2add83e622d1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d16c4ad7cfee1247394e7bbb77ce47852d4f18af08ffa88a4a2e61f5d2a15371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://f305b22b6166f0990e4a6b5fb6d5c77aaa48e9def01889ea7d41837f1139a13f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.181991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.182039 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.182057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.182079 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.182096 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.188482 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.210998 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2cc17c62-00e2-4756-afa5-60655e6a5a71\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af148ab18b6851bea20907347dd5df18a286823ff11d89bf8fdbae5986d177b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://72f30ecb58b47b7b7668322fd937deb6c23c6b4b19793db3ff7410d4553fc52e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fcfd7d3029661330394ab0f98af5ecc8ea26602a5ba1efe2505d0ad2fde693f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3140cf6860dd0eca4aea4de9c7e2825e8c3e38df68066702f6dffbd40ab60578\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://22fcb
c3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://22fcbc3ad960f66b7c2d34455a6c92b913ad24bb9834a142fbab0d5038b77738\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a0ca733e5f8d52cf0f1fb3381a26eea72a2d0529c9f07dc1016035546e520bc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:28Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cd6435bcdd430ed81543937b0c77705d852391af7d4c4e31a59de4520e589266\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-jr6sx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-j2xfs\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.235525 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d7d0c92-6857-4846-93ab-3364282a1e85\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:25Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2025-12-03T06:32:19Z\\\",\\\"message\\\":\\\"F1203 06:32:19.104042 6817 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not 
added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:19Z is after 2025-08-24T17:21:41Z]\\\\nI1203 06:32:19.104067 6817 services_controller.go:451] Built service openshift-kube-controller-manager/kube-controller-manager cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-controller-manager/kube-controller-manager_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-kube-controller-manager/kube-controller-manager\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:32:18Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f568e41d44e69c74e9
87c2589a9139e61577a756eaadc926003d5492768d5797\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-sntzl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:24Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-ps95j\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.251995 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"877c82a5-4683-47a1-8a61-639e563263af\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://34d8bce6b898980cd7df9c505406975be951530334c5e3abad1c99b4f28d9cc2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://610be2c865c6b97753368226340ec6f0673ef
761258593cb5b8f803c752ff0b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2wl7l\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:36Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-nbmt9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.278025 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8801f972-95eb-4de4-ae44-00da9ab048b3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:04Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2025-12-03T06:31:21Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI1203 06:31:15.732716 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI1203 06:31:15.734060 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3115159719/tls.crt::/tmp/serving-cert-3115159719/tls.key\\\\\\\"\\\\nI1203 06:31:21.329198 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI1203 06:31:21.332863 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI1203 06:31:21.332898 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI1203 06:31:21.332927 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI1203 06:31:21.332937 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI1203 06:31:21.341253 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI1203 06:31:21.341294 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW1203 06:31:21.341298 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341355 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW1203 06:31:21.341368 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW1203 06:31:21.341379 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW1203 06:31:21.341386 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW1203 06:31:21.341397 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF1203 06:31:21.347615 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:05Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:22Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://368e0270d099d9d855db5642cc00ddcd
5e17e7d05c1764f1077d7482a88cd21a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.284362 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.284402 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.284416 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.284436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.284451 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.292433 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66
285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.309076 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.323924 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.335246 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.350309 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:23Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:23 crc 
kubenswrapper[4831]: I1203 06:32:23.387867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.387931 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.387952 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.387978 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.387995 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.490518 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.490583 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.490600 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.490624 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.490641 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.593656 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.594040 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.594057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.594078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.594094 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.698023 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.698067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.698078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.698094 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.698106 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.807113 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.807158 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.807315 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.807350 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.807369 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.910512 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.910572 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.910589 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.910616 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:23 crc kubenswrapper[4831]: I1203 06:32:23.910638 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:23Z","lastTransitionTime":"2025-12-03T06:32:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.011802 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.011927 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.012272 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:24 crc kubenswrapper[4831]: E1203 06:32:24.012506 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.012367 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:24 crc kubenswrapper[4831]: E1203 06:32:24.012987 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:24 crc kubenswrapper[4831]: E1203 06:32:24.013257 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:24 crc kubenswrapper[4831]: E1203 06:32:24.013429 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.013840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.013877 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.013886 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.013906 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.013917 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.116833 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.117136 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.117227 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.117311 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.117448 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.220769 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.220845 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.220865 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.220890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.220908 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.323218 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.323302 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.323365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.323388 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.323404 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.425637 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.425672 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.425679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.425694 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.425702 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.528533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.528568 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.528575 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.528588 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.528597 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.632162 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.632237 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.632260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.632288 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.632311 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.734886 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.734932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.734946 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.734967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.734982 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.837728 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.837771 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.837780 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.837798 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.837810 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.940719 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.940770 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.940782 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.940801 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:24 crc kubenswrapper[4831]: I1203 06:32:24.940811 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:24Z","lastTransitionTime":"2025-12-03T06:32:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.043766 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.043834 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.043858 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.043889 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.043912 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.146458 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.146527 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.146553 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.146582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.146603 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.249296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.249415 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.249451 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.249476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.249495 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.352489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.352549 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.352566 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.352588 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.352614 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.455594 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.455643 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.455659 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.455683 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.455698 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.558179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.558230 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.558247 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.558269 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.558287 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.661969 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.662067 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.662093 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.662126 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.662152 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.766439 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.766529 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.766548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.766575 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.766594 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.869263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.869368 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.869394 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.869428 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.869456 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.972933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.972976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.972987 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.973003 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:25 crc kubenswrapper[4831]: I1203 06:32:25.973018 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:25Z","lastTransitionTime":"2025-12-03T06:32:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.012433 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.012481 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.012481 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.012617 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.012641 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.012781 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.012953 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.013051 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.028884 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.028990 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.029020 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029053 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:30.029018442 +0000 UTC m=+147.372601990 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029123 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.029123 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029137 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029268 4831 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.029213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029147 4831 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029378 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:30.029351122 +0000 UTC m=+147.372934660 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029399 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029160 4831 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029434 4831 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029460 4831 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029417 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:30.029406214 +0000 UTC m=+147.372989722 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029530 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:30.029502947 +0000 UTC m=+147.373086545 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Dec 03 06:32:26 crc kubenswrapper[4831]: E1203 06:32:26.029612 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:30.029552879 +0000 UTC m=+147.373136507 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.075120 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.075168 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.075180 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.075199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.075212 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.177953 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.178034 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.178060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.178086 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.178103 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.280878 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.280933 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.280950 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.280976 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.281010 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.384809 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.384884 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.384900 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.384929 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.384950 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.489221 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.489279 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.489296 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.489346 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.489364 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.592469 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.592532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.592548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.592574 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.592591 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.695480 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.695523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.695559 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.695573 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.695582 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.799396 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.799482 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.799508 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.799545 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.799570 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.903403 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.903474 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.903491 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.903515 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:26 crc kubenswrapper[4831]: I1203 06:32:26.903532 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:26Z","lastTransitionTime":"2025-12-03T06:32:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.006966 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.007016 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.007028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.007048 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.007060 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.109479 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.109548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.109565 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.109592 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.109611 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.212804 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.212875 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.212898 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.212931 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.212954 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.316637 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.316703 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.316721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.316745 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.316768 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.419819 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.419873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.419890 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.419912 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.419929 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.523060 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.523522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.523544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.523569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.523586 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.562365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.562413 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.562426 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.562446 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.562458 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: E1203 06:32:27.577860 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.581970 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.582019 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.582035 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.582057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.582073 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: E1203 06:32:27.600628 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.606199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.606277 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.606306 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.606373 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.606399 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: E1203 06:32:27.631130 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.635596 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.635646 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.635666 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.635691 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.635707 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: E1203 06:32:27.657006 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.661908 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.662024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.662043 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.662066 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.662083 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: E1203 06:32:27.677814 4831 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"lastTransitionTime\\\":\\\"2025-12-03T06:32:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0b87e1b8-395c-4ff9-834e-79e149dbf129\\\",\\\"systemUUID\\\":\\\"1b42c798-2812-40ef-a506-f181e54d7ef9\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:27Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:27 crc kubenswrapper[4831]: E1203 06:32:27.678034 4831 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.680125 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.680183 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.680195 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.680214 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.680228 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.788913 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.788967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.788985 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.789009 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.789028 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.892524 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.892600 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.892618 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.892644 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.892662 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.995605 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.995658 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.995676 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.995699 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:27 crc kubenswrapper[4831]: I1203 06:32:27.995716 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:27Z","lastTransitionTime":"2025-12-03T06:32:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.012509 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.012590 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.012588 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.012518 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:28 crc kubenswrapper[4831]: E1203 06:32:28.012726 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:28 crc kubenswrapper[4831]: E1203 06:32:28.012899 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:28 crc kubenswrapper[4831]: E1203 06:32:28.013023 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:28 crc kubenswrapper[4831]: E1203 06:32:28.013308 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.098828 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.098897 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.098920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.098948 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.098968 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.201729 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.201789 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.201816 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.201848 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.201868 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.305367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.305420 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.305436 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.305460 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.305476 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.408707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.408751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.408768 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.408791 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.408810 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.511269 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.511377 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.511403 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.511433 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.511454 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.614512 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.614585 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.614603 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.614626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.614646 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.717420 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.717489 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.717513 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.717543 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.717564 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.821398 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.821468 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.821488 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.821569 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.821589 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.925084 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.925229 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.925258 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.925291 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:28 crc kubenswrapper[4831]: I1203 06:32:28.925415 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:28Z","lastTransitionTime":"2025-12-03T06:32:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.028405 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.028482 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.028506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.028638 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.028665 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.131671 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.131743 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.131767 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.131798 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.131818 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.234947 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.235012 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.235031 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.235057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.235074 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.338246 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.338310 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.338366 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.338391 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.338409 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.440960 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.441082 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.441103 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.441127 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.441144 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.543440 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.543506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.543523 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.543548 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.543565 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.647015 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.647104 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.647121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.647145 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.647164 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.750477 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.750545 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.750562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.750587 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.750604 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.854209 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.854374 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.854401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.854431 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.854461 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.959002 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.959061 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.959078 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.959109 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:29 crc kubenswrapper[4831]: I1203 06:32:29.959125 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:29Z","lastTransitionTime":"2025-12-03T06:32:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.012376 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.012422 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.012539 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:30 crc kubenswrapper[4831]: E1203 06:32:30.012579 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.012574 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:30 crc kubenswrapper[4831]: E1203 06:32:30.012743 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:30 crc kubenswrapper[4831]: E1203 06:32:30.012830 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:30 crc kubenswrapper[4831]: E1203 06:32:30.012994 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.061725 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.061794 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.061811 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.061840 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.061855 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.165134 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.165218 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.165244 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.165284 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.165306 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.268969 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.269039 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.269055 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.269081 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.269099 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.372582 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.372657 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.372677 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.372701 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.372719 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.476640 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.476721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.476751 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.476781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.476802 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.581423 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.581497 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.581516 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.581545 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.581566 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.684966 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.685028 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.685045 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.685073 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.685092 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.788057 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.788143 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.788361 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.788420 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.788447 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.892885 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.892958 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.892987 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.893024 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.893045 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.996359 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.996445 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.996470 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.996499 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:30 crc kubenswrapper[4831]: I1203 06:32:30.996517 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:30Z","lastTransitionTime":"2025-12-03T06:32:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.100114 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.100239 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.100263 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.100387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.100473 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.203614 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.203683 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.203707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.203736 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.203757 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.306805 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.306875 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.306894 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.306929 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.306950 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.410334 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.410376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.410386 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.410401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.410412 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.513630 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.513672 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.513686 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.513704 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.513717 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.616932 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.616992 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.617017 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.617045 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.617065 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.720452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.720520 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.720537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.720562 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.720579 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.823470 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.823537 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.823554 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.823579 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.823597 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.926387 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.926464 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.926481 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.926510 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:31 crc kubenswrapper[4831]: I1203 06:32:31.926530 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:31Z","lastTransitionTime":"2025-12-03T06:32:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.012922 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.012962 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.012972 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.013048 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:32 crc kubenswrapper[4831]: E1203 06:32:32.013217 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:32 crc kubenswrapper[4831]: E1203 06:32:32.013449 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:32 crc kubenswrapper[4831]: E1203 06:32:32.013631 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:32 crc kubenswrapper[4831]: E1203 06:32:32.013735 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.029430 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.029522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.029580 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.029609 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.029634 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.133219 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.133290 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.133308 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.133365 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.133384 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.236707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.236765 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.236781 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.236805 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.236823 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.340264 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.340358 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.340376 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.340401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.340418 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.443503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.443587 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.443610 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.443637 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.443659 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.547795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.547836 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.547844 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.547859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.547871 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.650687 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.650748 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.650764 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.650790 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.650809 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.754164 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.754205 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.754217 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.754236 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.754248 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.857854 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.857903 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.857920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.857942 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.857957 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.960916 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.960967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.960983 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.961007 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:32 crc kubenswrapper[4831]: I1203 06:32:32.961023 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:32Z","lastTransitionTime":"2025-12-03T06:32:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.014692 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:32:33 crc kubenswrapper[4831]: E1203 06:32:33.015737 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.034875 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"96351b1b-146a-4679-8199-64af3225ce78\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:05Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c62b61d03e6a760318ecd93acb4e1d3c7532eca3183a8560571fbaa187996f26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084
652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:03Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d75c559b0c6d7e615f0460b11fffa6cc05995fb1457815c945bf667691e4bcc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{
\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e9df51823bdfb6f36cbd626e00c19869b9aff6d476c7b4dff400a6cb40bf3558\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:04Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:03Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.055828 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:21Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.064681 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.064736 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.064754 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.064777 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.064794 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.072663 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-cjft5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bdcd6b2b-8124-46f0-9b94-e32e05ef6e49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:23Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:24Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a106cd4af8b4eb13a66e8b67d8c462b39d00d092f22a56684fd6bcc6da151638\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:23Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4hvz\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:23Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-cjft5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.089255 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dm6hd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8c66c9ba-10ff-43b1-baab-d4bb0b32d7a1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8b74eeb971d78af15e2e6c262a0e2eaeda1d87158eae0f74d3ab234fac08ae83\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2025-12-03T06:31:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zppkv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:25Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dm6hd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.106911 4831 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-lllsw" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8283839a-a189-493f-bde7-e0193d575963\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T06:31:37Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-8jjmv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T06:31:37Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-lllsw\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2025-12-03T06:32:33Z is after 2025-08-24T17:21:41Z" Dec 03 06:32:33 crc 
kubenswrapper[4831]: I1203 06:32:33.146879 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=71.146854784 podStartE2EDuration="1m11.146854784s" podCreationTimestamp="2025-12-03 06:31:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:33.146195183 +0000 UTC m=+90.489778721" watchObservedRunningTime="2025-12-03 06:32:33.146854784 +0000 UTC m=+90.490438332" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.167669 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.168103 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.168361 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.168626 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.168851 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.268438 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vz8ft" podStartSLOduration=70.268418258 podStartE2EDuration="1m10.268418258s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:33.250526741 +0000 UTC m=+90.594110259" watchObservedRunningTime="2025-12-03 06:32:33.268418258 +0000 UTC m=+90.612001776" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.268637 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podStartSLOduration=70.268630905 podStartE2EDuration="1m10.268630905s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:33.267967275 +0000 UTC m=+90.611550793" watchObservedRunningTime="2025-12-03 06:32:33.268630905 +0000 UTC m=+90.612214423" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.271441 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.271485 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.271493 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.271510 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.271521 4831 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.296753 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=12.296720886 podStartE2EDuration="12.296720886s" podCreationTimestamp="2025-12-03 06:32:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:33.28014688 +0000 UTC m=+90.623730428" watchObservedRunningTime="2025-12-03 06:32:33.296720886 +0000 UTC m=+90.640304434" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.351058 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-j2xfs" podStartSLOduration=70.351033158 podStartE2EDuration="1m10.351033158s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:33.318301871 +0000 UTC m=+90.661885419" watchObservedRunningTime="2025-12-03 06:32:33.351033158 +0000 UTC m=+90.694616706" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.367202 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-nbmt9" podStartSLOduration=70.36717699 podStartE2EDuration="1m10.36717699s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 06:32:33.367036955 +0000 UTC m=+90.710620503" watchObservedRunningTime="2025-12-03 06:32:33.36717699 +0000 UTC m=+90.710760538" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.374289 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.374340 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.374352 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.374367 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.374379 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.386454 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=41.38642758 podStartE2EDuration="41.38642758s" podCreationTimestamp="2025-12-03 06:31:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:33.386376158 +0000 UTC m=+90.729959696" watchObservedRunningTime="2025-12-03 06:32:33.38642758 +0000 UTC m=+90.730011108" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.477771 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.477820 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.477837 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.477859 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.477876 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.580563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.580621 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.580639 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.580665 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.580682 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.683807 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.683860 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.683876 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.683902 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.683920 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.787772 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.787895 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.787922 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.787991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.788014 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.891404 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.891479 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.891503 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.891532 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.891554 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.994812 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.994887 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.994911 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.994943 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:33 crc kubenswrapper[4831]: I1203 06:32:33.994967 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:33Z","lastTransitionTime":"2025-12-03T06:32:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.012741 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.012827 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:34 crc kubenswrapper[4831]: E1203 06:32:34.012950 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.012759 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.012962 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:34 crc kubenswrapper[4831]: E1203 06:32:34.013090 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:34 crc kubenswrapper[4831]: E1203 06:32:34.013244 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:34 crc kubenswrapper[4831]: E1203 06:32:34.013372 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.099260 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.099379 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.099405 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.099437 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.099460 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.202735 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.202792 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.202809 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.202833 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.202851 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.305967 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.306033 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.306050 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.306076 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.306093 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.409795 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.409873 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.409892 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.409917 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.409934 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.513636 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.513700 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.513723 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.513753 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.513774 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.617008 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.617079 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.617097 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.617121 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.617144 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.720636 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.720699 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.720716 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.720740 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.720759 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.823656 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.823720 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.823738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.823763 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.823781 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.927741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.927791 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.927808 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.927831 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:34 crc kubenswrapper[4831]: I1203 06:32:34.927848 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:34Z","lastTransitionTime":"2025-12-03T06:32:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.030074 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.030110 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.030119 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.030132 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.030142 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.132689 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.132756 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.132774 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.132798 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.132815 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.235867 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.235921 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.235938 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.235962 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.235979 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.339604 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.339683 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.339707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.339738 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.339760 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.442861 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.442909 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.442925 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.442970 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.442987 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.546293 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.546384 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.546401 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.546422 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.546438 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.649991 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.650223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.650240 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.650270 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.650287 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.753577 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.753641 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.753657 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.753679 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.753696 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.856307 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.856426 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.856447 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.856476 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.856495 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.959974 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.960042 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.960063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.960087 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:35 crc kubenswrapper[4831]: I1203 06:32:35.960104 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:35Z","lastTransitionTime":"2025-12-03T06:32:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.011920 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.012026 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.012048 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:36 crc kubenswrapper[4831]: E1203 06:32:36.012218 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.012270 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:36 crc kubenswrapper[4831]: E1203 06:32:36.012457 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:36 crc kubenswrapper[4831]: E1203 06:32:36.012591 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:36 crc kubenswrapper[4831]: E1203 06:32:36.012782 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.063166 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.063215 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.063231 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.063254 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.063271 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.166835 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.166922 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.166941 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.166968 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.166990 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.270352 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.270410 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.270427 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.270452 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.270469 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.373842 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.373904 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.373920 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.373945 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.373961 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.477417 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.477544 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.477604 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.477635 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.477652 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.580249 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.580354 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.580372 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.580395 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.580445 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.683965 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.684075 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.684110 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.684150 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.684173 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.787615 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.787682 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.787707 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.787735 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.787756 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.891108 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.891163 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.891179 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.891204 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.891221 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.993825 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.993883 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.993899 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.993921 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:36 crc kubenswrapper[4831]: I1203 06:32:36.993937 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:36Z","lastTransitionTime":"2025-12-03T06:32:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.097463 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.097522 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.097533 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.097560 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.097574 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.201777 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.201817 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.201849 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.201866 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.201876 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.305432 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.305506 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.305530 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.305563 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.305587 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.409123 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.409183 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.409199 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.409223 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.409238 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.511990 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.512059 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.512077 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.512100 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.512119 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.615177 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.615232 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.615249 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.615275 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.615292 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.718018 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.718063 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.718077 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.718096 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.718108 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.820697 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.820741 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.820754 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.820770 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.820781 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.924156 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.924205 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.924216 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.924236 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.924248 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.980657 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.980721 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.980737 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.980762 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Dec 03 06:32:37 crc kubenswrapper[4831]: I1203 06:32:37.980779 4831 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2025-12-03T06:32:37Z","lastTransitionTime":"2025-12-03T06:32:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.011670 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:38 crc kubenswrapper[4831]: E1203 06:32:38.011802 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.012017 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:38 crc kubenswrapper[4831]: E1203 06:32:38.012073 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.012206 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:38 crc kubenswrapper[4831]: E1203 06:32:38.012262 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.012536 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:38 crc kubenswrapper[4831]: E1203 06:32:38.012679 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.045755 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6"] Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.046400 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.049344 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.049365 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.050185 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.052045 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.067807 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f271728e-bc6e-463f-aaaa-8c469ad48f6c-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.067882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f271728e-bc6e-463f-aaaa-8c469ad48f6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.067922 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f271728e-bc6e-463f-aaaa-8c469ad48f6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.068027 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f271728e-bc6e-463f-aaaa-8c469ad48f6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.068123 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f271728e-bc6e-463f-aaaa-8c469ad48f6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: 
I1203 06:32:38.084032 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-cjft5" podStartSLOduration=75.084004674 podStartE2EDuration="1m15.084004674s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:38.083101836 +0000 UTC m=+95.426685374" watchObservedRunningTime="2025-12-03 06:32:38.084004674 +0000 UTC m=+95.427588212" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.099012 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dm6hd" podStartSLOduration=75.098982139 podStartE2EDuration="1m15.098982139s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:38.098980999 +0000 UTC m=+95.442564547" watchObservedRunningTime="2025-12-03 06:32:38.098982139 +0000 UTC m=+95.442565697" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.167364 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=71.167342187 podStartE2EDuration="1m11.167342187s" podCreationTimestamp="2025-12-03 06:31:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:38.141537229 +0000 UTC m=+95.485120807" watchObservedRunningTime="2025-12-03 06:32:38.167342187 +0000 UTC m=+95.510925735" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.168903 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f271728e-bc6e-463f-aaaa-8c469ad48f6c-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.168963 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f271728e-bc6e-463f-aaaa-8c469ad48f6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.169002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f271728e-bc6e-463f-aaaa-8c469ad48f6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.169085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f271728e-bc6e-463f-aaaa-8c469ad48f6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.169129 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f271728e-bc6e-463f-aaaa-8c469ad48f6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.170059 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/f271728e-bc6e-463f-aaaa-8c469ad48f6c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.170137 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/f271728e-bc6e-463f-aaaa-8c469ad48f6c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.170987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f271728e-bc6e-463f-aaaa-8c469ad48f6c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.178911 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f271728e-bc6e-463f-aaaa-8c469ad48f6c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.198727 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f271728e-bc6e-463f-aaaa-8c469ad48f6c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-2fcg6\" (UID: \"f271728e-bc6e-463f-aaaa-8c469ad48f6c\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.363563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.572368 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" event={"ID":"f271728e-bc6e-463f-aaaa-8c469ad48f6c","Type":"ContainerStarted","Data":"6e412af8fd79c9ab9cdd548fe503d93a9292b1ce971358f266f07f005c81793d"} Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.572442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" event={"ID":"f271728e-bc6e-463f-aaaa-8c469ad48f6c","Type":"ContainerStarted","Data":"3ad2bf72e3a0e5f5ad2b632ca37456efa5d8f96e286f9e5cb309018d1328650d"} Dec 03 06:32:38 crc kubenswrapper[4831]: I1203 06:32:38.594380 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-2fcg6" podStartSLOduration=75.594352816 podStartE2EDuration="1m15.594352816s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:38.593060884 +0000 UTC m=+95.936644452" watchObservedRunningTime="2025-12-03 06:32:38.594352816 +0000 UTC m=+95.937936384" Dec 03 06:32:40 crc kubenswrapper[4831]: I1203 06:32:40.011964 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:40 crc kubenswrapper[4831]: I1203 06:32:40.012014 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:40 crc kubenswrapper[4831]: E1203 06:32:40.012451 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:40 crc kubenswrapper[4831]: I1203 06:32:40.012183 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:40 crc kubenswrapper[4831]: I1203 06:32:40.012045 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:40 crc kubenswrapper[4831]: E1203 06:32:40.012634 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:40 crc kubenswrapper[4831]: E1203 06:32:40.012728 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:40 crc kubenswrapper[4831]: E1203 06:32:40.012842 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:42 crc kubenswrapper[4831]: I1203 06:32:42.011744 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:42 crc kubenswrapper[4831]: E1203 06:32:42.011877 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:42 crc kubenswrapper[4831]: I1203 06:32:42.012067 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:42 crc kubenswrapper[4831]: E1203 06:32:42.012150 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:42 crc kubenswrapper[4831]: I1203 06:32:42.012294 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:42 crc kubenswrapper[4831]: E1203 06:32:42.012428 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:42 crc kubenswrapper[4831]: I1203 06:32:42.012601 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:42 crc kubenswrapper[4831]: E1203 06:32:42.012677 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:42 crc kubenswrapper[4831]: I1203 06:32:42.485872 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:42 crc kubenswrapper[4831]: E1203 06:32:42.486265 4831 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:32:42 crc kubenswrapper[4831]: E1203 06:32:42.486350 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs podName:8283839a-a189-493f-bde7-e0193d575963 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:46.486312237 +0000 UTC m=+163.829895745 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs") pod "network-metrics-daemon-lllsw" (UID: "8283839a-a189-493f-bde7-e0193d575963") : object "openshift-multus"/"metrics-daemon-secret" not registered Dec 03 06:32:43 crc kubenswrapper[4831]: I1203 06:32:43.035398 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Dec 03 06:32:44 crc kubenswrapper[4831]: I1203 06:32:44.011967 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:44 crc kubenswrapper[4831]: I1203 06:32:44.011997 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:44 crc kubenswrapper[4831]: I1203 06:32:44.012139 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:44 crc kubenswrapper[4831]: E1203 06:32:44.012351 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:44 crc kubenswrapper[4831]: E1203 06:32:44.012512 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:44 crc kubenswrapper[4831]: E1203 06:32:44.012683 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:44 crc kubenswrapper[4831]: I1203 06:32:44.013140 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:44 crc kubenswrapper[4831]: E1203 06:32:44.013789 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:45 crc kubenswrapper[4831]: I1203 06:32:45.013224 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:32:45 crc kubenswrapper[4831]: E1203 06:32:45.013535 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-ps95j_openshift-ovn-kubernetes(3d7d0c92-6857-4846-93ab-3364282a1e85)\"" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" Dec 03 06:32:46 crc kubenswrapper[4831]: I1203 06:32:46.012092 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:46 crc kubenswrapper[4831]: I1203 06:32:46.012184 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:46 crc kubenswrapper[4831]: I1203 06:32:46.012132 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:46 crc kubenswrapper[4831]: E1203 06:32:46.012359 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:46 crc kubenswrapper[4831]: I1203 06:32:46.012460 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:46 crc kubenswrapper[4831]: E1203 06:32:46.012653 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:46 crc kubenswrapper[4831]: E1203 06:32:46.012757 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:46 crc kubenswrapper[4831]: E1203 06:32:46.012934 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:48 crc kubenswrapper[4831]: I1203 06:32:48.012487 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:48 crc kubenswrapper[4831]: I1203 06:32:48.012516 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:48 crc kubenswrapper[4831]: I1203 06:32:48.012555 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:48 crc kubenswrapper[4831]: I1203 06:32:48.012613 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:48 crc kubenswrapper[4831]: E1203 06:32:48.013612 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:48 crc kubenswrapper[4831]: E1203 06:32:48.014047 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:48 crc kubenswrapper[4831]: E1203 06:32:48.014178 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:48 crc kubenswrapper[4831]: E1203 06:32:48.014241 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:50 crc kubenswrapper[4831]: I1203 06:32:50.012051 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:50 crc kubenswrapper[4831]: I1203 06:32:50.012051 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:50 crc kubenswrapper[4831]: I1203 06:32:50.012050 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:50 crc kubenswrapper[4831]: I1203 06:32:50.012272 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:50 crc kubenswrapper[4831]: E1203 06:32:50.012916 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:50 crc kubenswrapper[4831]: E1203 06:32:50.013157 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:50 crc kubenswrapper[4831]: E1203 06:32:50.013271 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:50 crc kubenswrapper[4831]: E1203 06:32:50.013395 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:52 crc kubenswrapper[4831]: I1203 06:32:52.012445 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:52 crc kubenswrapper[4831]: I1203 06:32:52.012528 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:52 crc kubenswrapper[4831]: E1203 06:32:52.012595 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:52 crc kubenswrapper[4831]: E1203 06:32:52.012723 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:52 crc kubenswrapper[4831]: I1203 06:32:52.013937 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:52 crc kubenswrapper[4831]: I1203 06:32:52.014011 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:52 crc kubenswrapper[4831]: E1203 06:32:52.014287 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:52 crc kubenswrapper[4831]: E1203 06:32:52.014384 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:53 crc kubenswrapper[4831]: I1203 06:32:53.053248 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=10.053224746 podStartE2EDuration="10.053224746s" podCreationTimestamp="2025-12-03 06:32:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:32:53.05174477 +0000 UTC m=+110.395328338" watchObservedRunningTime="2025-12-03 06:32:53.053224746 +0000 UTC m=+110.396808294" Dec 03 06:32:54 crc kubenswrapper[4831]: I1203 06:32:54.012375 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:54 crc kubenswrapper[4831]: I1203 06:32:54.012440 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:54 crc kubenswrapper[4831]: I1203 06:32:54.012465 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:54 crc kubenswrapper[4831]: I1203 06:32:54.012385 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:54 crc kubenswrapper[4831]: E1203 06:32:54.012566 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:54 crc kubenswrapper[4831]: E1203 06:32:54.012744 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:54 crc kubenswrapper[4831]: E1203 06:32:54.012835 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:54 crc kubenswrapper[4831]: E1203 06:32:54.013022 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:56 crc kubenswrapper[4831]: I1203 06:32:56.012569 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:56 crc kubenswrapper[4831]: I1203 06:32:56.012662 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:56 crc kubenswrapper[4831]: I1203 06:32:56.012613 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:56 crc kubenswrapper[4831]: I1203 06:32:56.012569 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:56 crc kubenswrapper[4831]: E1203 06:32:56.012784 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:56 crc kubenswrapper[4831]: E1203 06:32:56.012911 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:56 crc kubenswrapper[4831]: E1203 06:32:56.013042 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:56 crc kubenswrapper[4831]: E1203 06:32:56.013194 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:57 crc kubenswrapper[4831]: I1203 06:32:57.648487 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/1.log" Dec 03 06:32:57 crc kubenswrapper[4831]: I1203 06:32:57.649377 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/0.log" Dec 03 06:32:57 crc kubenswrapper[4831]: I1203 06:32:57.649446 4831 generic.go:334] "Generic (PLEG): container finished" podID="74a16df4-1f25-4b0f-9e08-f6486f262a68" containerID="2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4" exitCode=1 Dec 03 06:32:57 crc kubenswrapper[4831]: I1203 06:32:57.649486 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vz8ft" event={"ID":"74a16df4-1f25-4b0f-9e08-f6486f262a68","Type":"ContainerDied","Data":"2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4"} Dec 03 06:32:57 crc kubenswrapper[4831]: I1203 06:32:57.649528 4831 scope.go:117] "RemoveContainer" containerID="7f6d1e1770cba034ba5b945ae968da11fa845ac47dee5ff6c7274bce97596fc5" Dec 03 06:32:57 crc kubenswrapper[4831]: I1203 06:32:57.651508 4831 scope.go:117] "RemoveContainer" containerID="2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4" Dec 03 06:32:57 crc kubenswrapper[4831]: E1203 06:32:57.652073 4831 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-vz8ft_openshift-multus(74a16df4-1f25-4b0f-9e08-f6486f262a68)\"" pod="openshift-multus/multus-vz8ft" podUID="74a16df4-1f25-4b0f-9e08-f6486f262a68" Dec 03 06:32:58 crc kubenswrapper[4831]: I1203 06:32:58.012453 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:32:58 crc kubenswrapper[4831]: I1203 06:32:58.012601 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:32:58 crc kubenswrapper[4831]: E1203 06:32:58.012606 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:32:58 crc kubenswrapper[4831]: I1203 06:32:58.012472 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:32:58 crc kubenswrapper[4831]: E1203 06:32:58.012762 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:32:58 crc kubenswrapper[4831]: E1203 06:32:58.012903 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:32:58 crc kubenswrapper[4831]: I1203 06:32:58.013265 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:32:58 crc kubenswrapper[4831]: E1203 06:32:58.013655 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:32:58 crc kubenswrapper[4831]: I1203 06:32:58.663289 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/1.log" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.012103 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:00 crc kubenswrapper[4831]: E1203 06:33:00.012894 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.012253 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.012243 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.012206 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:00 crc kubenswrapper[4831]: E1203 06:33:00.013145 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:33:00 crc kubenswrapper[4831]: E1203 06:33:00.013369 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:33:00 crc kubenswrapper[4831]: E1203 06:33:00.013505 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.013785 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.674459 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/3.log" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.678770 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerStarted","Data":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.679331 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.720136 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podStartSLOduration=97.720113557 podStartE2EDuration="1m37.720113557s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:00.718455964 +0000 UTC 
m=+118.062039512" watchObservedRunningTime="2025-12-03 06:33:00.720113557 +0000 UTC m=+118.063697105" Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.907430 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lllsw"] Dec 03 06:33:00 crc kubenswrapper[4831]: I1203 06:33:00.907646 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:00 crc kubenswrapper[4831]: E1203 06:33:00.907750 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:33:02 crc kubenswrapper[4831]: I1203 06:33:02.013264 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:02 crc kubenswrapper[4831]: I1203 06:33:02.013358 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:02 crc kubenswrapper[4831]: E1203 06:33:02.013470 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:33:02 crc kubenswrapper[4831]: I1203 06:33:02.013540 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:02 crc kubenswrapper[4831]: E1203 06:33:02.013657 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:33:02 crc kubenswrapper[4831]: E1203 06:33:02.013723 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:33:02 crc kubenswrapper[4831]: E1203 06:33:02.954918 4831 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Dec 03 06:33:03 crc kubenswrapper[4831]: I1203 06:33:03.012394 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:03 crc kubenswrapper[4831]: E1203 06:33:03.014308 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:33:03 crc kubenswrapper[4831]: E1203 06:33:03.117538 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:33:04 crc kubenswrapper[4831]: I1203 06:33:04.011962 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:04 crc kubenswrapper[4831]: I1203 06:33:04.012053 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:04 crc kubenswrapper[4831]: I1203 06:33:04.012081 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:04 crc kubenswrapper[4831]: E1203 06:33:04.012238 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:33:04 crc kubenswrapper[4831]: E1203 06:33:04.012379 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:33:04 crc kubenswrapper[4831]: E1203 06:33:04.012514 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:33:05 crc kubenswrapper[4831]: I1203 06:33:05.012701 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:05 crc kubenswrapper[4831]: E1203 06:33:05.013109 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:33:06 crc kubenswrapper[4831]: I1203 06:33:06.011886 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:06 crc kubenswrapper[4831]: I1203 06:33:06.011902 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:06 crc kubenswrapper[4831]: I1203 06:33:06.011907 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:06 crc kubenswrapper[4831]: E1203 06:33:06.011996 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:33:06 crc kubenswrapper[4831]: E1203 06:33:06.012241 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:33:06 crc kubenswrapper[4831]: E1203 06:33:06.012504 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:33:07 crc kubenswrapper[4831]: I1203 06:33:07.012390 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:07 crc kubenswrapper[4831]: E1203 06:33:07.012619 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:33:08 crc kubenswrapper[4831]: I1203 06:33:08.012510 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:08 crc kubenswrapper[4831]: I1203 06:33:08.012562 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:08 crc kubenswrapper[4831]: I1203 06:33:08.012570 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:08 crc kubenswrapper[4831]: E1203 06:33:08.012686 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:33:08 crc kubenswrapper[4831]: E1203 06:33:08.012817 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:33:08 crc kubenswrapper[4831]: E1203 06:33:08.012917 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:33:08 crc kubenswrapper[4831]: E1203 06:33:08.119232 4831 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:33:09 crc kubenswrapper[4831]: I1203 06:33:09.012606 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:09 crc kubenswrapper[4831]: E1203 06:33:09.012964 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:33:10 crc kubenswrapper[4831]: I1203 06:33:10.012203 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:10 crc kubenswrapper[4831]: I1203 06:33:10.012265 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:10 crc kubenswrapper[4831]: E1203 06:33:10.012425 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:33:10 crc kubenswrapper[4831]: E1203 06:33:10.012577 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:33:10 crc kubenswrapper[4831]: I1203 06:33:10.013500 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:10 crc kubenswrapper[4831]: E1203 06:33:10.013767 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:33:11 crc kubenswrapper[4831]: I1203 06:33:11.012114 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:11 crc kubenswrapper[4831]: I1203 06:33:11.012687 4831 scope.go:117] "RemoveContainer" containerID="2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4" Dec 03 06:33:11 crc kubenswrapper[4831]: E1203 06:33:11.012765 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:33:11 crc kubenswrapper[4831]: I1203 06:33:11.726051 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/1.log" Dec 03 06:33:11 crc kubenswrapper[4831]: I1203 06:33:11.726444 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vz8ft" event={"ID":"74a16df4-1f25-4b0f-9e08-f6486f262a68","Type":"ContainerStarted","Data":"cccd1d19fe7a46c7f1bfe0299b4666ece8cecce74a0354d94cf9edcb4d647bd5"} Dec 03 06:33:12 crc kubenswrapper[4831]: I1203 06:33:12.011699 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:12 crc kubenswrapper[4831]: I1203 06:33:12.011711 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:12 crc kubenswrapper[4831]: E1203 06:33:12.011896 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Dec 03 06:33:12 crc kubenswrapper[4831]: I1203 06:33:12.011710 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:12 crc kubenswrapper[4831]: E1203 06:33:12.012090 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Dec 03 06:33:12 crc kubenswrapper[4831]: E1203 06:33:12.012217 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Dec 03 06:33:13 crc kubenswrapper[4831]: I1203 06:33:13.012654 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:13 crc kubenswrapper[4831]: E1203 06:33:13.014657 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-lllsw" podUID="8283839a-a189-493f-bde7-e0193d575963" Dec 03 06:33:14 crc kubenswrapper[4831]: I1203 06:33:14.013412 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:14 crc kubenswrapper[4831]: I1203 06:33:14.013523 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:14 crc kubenswrapper[4831]: I1203 06:33:14.013412 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:14 crc kubenswrapper[4831]: I1203 06:33:14.015927 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 06:33:14 crc kubenswrapper[4831]: I1203 06:33:14.015941 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Dec 03 06:33:14 crc kubenswrapper[4831]: I1203 06:33:14.017769 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 06:33:14 crc kubenswrapper[4831]: I1203 06:33:14.017827 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 06:33:15 crc kubenswrapper[4831]: I1203 06:33:15.012533 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:15 crc kubenswrapper[4831]: I1203 06:33:15.016005 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 06:33:15 crc kubenswrapper[4831]: I1203 06:33:15.017943 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.890091 4831 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.941997 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wcqwh"] Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.943058 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.945283 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-22sdc"] Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.945962 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.947611 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7pqsq"] Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.948538 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.949442 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.951012 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch"] Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.951672 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.952059 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.957144 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57"] Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.957885 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.970827 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-dwsb6"] Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.971476 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.977784 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.979705 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.989949 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.993441 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-2t7pp"] Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.993617 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.994109 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-2t7pp" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.994374 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 06:33:18 crc kubenswrapper[4831]: I1203 06:33:18.996064 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.000840 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001227 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001373 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001410 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001542 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001605 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001675 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001753 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001820 4831 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001937 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.001973 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002120 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002186 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002351 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002127 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002495 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002544 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002630 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002706 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002807 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002893 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.002965 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.003048 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.003131 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.003180 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.003589 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.003900 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.004426 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-shxrz"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.004826 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.005535 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.005588 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.005793 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.005953 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.006136 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.006266 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.006442 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.006590 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.006741 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.006955 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.007297 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.007724 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.007905 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.008359 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.009640 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.023302 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.046171 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.046745 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.046980 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.049340 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.050365 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rddpj"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.051053 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.051198 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.051397 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.051538 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.051925 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.052054 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.052132 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.052248 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.052464 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.053134 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.053371 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.053494 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.053519 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.053559 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.053643 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.053747 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.053805 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.054970 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.055469 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.060126 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 
06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.060309 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.060529 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.060781 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.061030 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.061141 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.061231 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.061675 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.061802 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.062168 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.062377 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.062531 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.064712 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.069022 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.069816 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-6n7vj"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.070150 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.070279 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.070418 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d7hl4"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.070506 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.070653 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncw4v"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.070774 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.070903 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.071037 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.071233 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-k8c4l"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.071340 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.071387 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.072065 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.072468 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.072641 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.072942 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.074221 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.074669 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.074814 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.075359 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-knpqw"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.075848 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.095464 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.095936 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.096008 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.096157 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.096183 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.096508 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.096669 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.096833 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.097201 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 
06:33:19.097924 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.098628 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120630 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f2985436-5396-4ae0-936a-890d28feee53-node-pullsecrets\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6m8wf\" (UniqueName: \"kubernetes.io/projected/15853325-cf95-43bc-a17f-baecad9d4282-kube-api-access-6m8wf\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120708 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-config\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120732 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk5z4\" (UniqueName: \"kubernetes.io/projected/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-kube-api-access-tk5z4\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: 
\"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120756 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-images\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120776 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5286\" (UniqueName: \"kubernetes.io/projected/bfba7fc4-12b2-40ff-b18c-170051c75374-kube-api-access-j5286\") pod \"downloads-7954f5f757-2t7pp\" (UID: \"bfba7fc4-12b2-40ff-b18c-170051c75374\") " pod="openshift-console/downloads-7954f5f757-2t7pp" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120796 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: \"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120817 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-encryption-config\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120837 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-audit-policies\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120860 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15853325-cf95-43bc-a17f-baecad9d4282-proxy-tls\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-service-ca\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9acc4f76-607c-4694-aa50-cacf7fa07f50-config\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120911 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.120927 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-audit\") 
pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121077 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121096 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-trusted-ca-bundle\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121120 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-oauth-serving-cert\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121147 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfeb8ef-4262-4aef-a179-3018896ace13-serving-cert\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121169 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9acc4f76-607c-4694-aa50-cacf7fa07f50-serving-cert\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121208 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9acc4f76-607c-4694-aa50-cacf7fa07f50-trusted-ca\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121228 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-config\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121252 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/15853325-cf95-43bc-a17f-baecad9d4282-images\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121274 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: \"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121296 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-oauth-config\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121344 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-serving-cert\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121366 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15853325-cf95-43bc-a17f-baecad9d4282-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121390 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-client-ca\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121417 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-serving-cert\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121453 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jc4qm\" (UniqueName: \"kubernetes.io/projected/c289d28d-642e-4cc4-9d25-f025800585d1-kube-api-access-jc4qm\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121476 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zptfx\" (UniqueName: \"kubernetes.io/projected/f2985436-5396-4ae0-936a-890d28feee53-kube-api-access-zptfx\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121499 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121520 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w55j7\" (UniqueName: \"kubernetes.io/projected/8cfeb8ef-4262-4aef-a179-3018896ace13-kube-api-access-w55j7\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc 
kubenswrapper[4831]: I1203 06:33:19.121555 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121554 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699c38a7-81dd-4614-8bb9-cd97b5756fc4-serving-cert\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121679 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121705 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-etcd-client\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121730 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-client-ca\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121742 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121754 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66d7f813-acc8-4e09-9575-9c7848a3b062-audit-dir\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121787 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121796 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-config\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121826 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-etcd-serving-ca\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121850 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-serving-cert\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121858 4831 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121872 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-console-config\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121902 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed20032c-5db7-416c-9881-d576c432e4ac-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121954 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121977 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-config\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.121998 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2985436-5396-4ae0-936a-890d28feee53-audit-dir\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122020 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wr8h\" (UniqueName: \"kubernetes.io/projected/9acc4f76-607c-4694-aa50-cacf7fa07f50-kube-api-access-2wr8h\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122040 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-image-import-ca\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-etcd-client\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122083 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8bs\" (UniqueName: \"kubernetes.io/projected/66d7f813-acc8-4e09-9575-9c7848a3b062-kube-api-access-bb8bs\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122117 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nl58p\" (UniqueName: \"kubernetes.io/projected/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-kube-api-access-nl58p\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122138 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-encryption-config\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122171 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed20032c-5db7-416c-9881-d576c432e4ac-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122192 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122214 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-cvkpr\" (UniqueName: \"kubernetes.io/projected/699c38a7-81dd-4614-8bb9-cd97b5756fc4-kube-api-access-cvkpr\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed20032c-5db7-416c-9881-d576c432e4ac-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122282 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wd7mt\" (UniqueName: \"kubernetes.io/projected/ed20032c-5db7-416c-9881-d576c432e4ac-kube-api-access-wd7mt\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122290 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87j4p"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122605 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.122895 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.123245 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.123717 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.123945 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.124231 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.125218 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.125364 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.125453 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.125533 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.125568 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 
06:33:19.125694 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.125799 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.125849 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kjgd7"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.125890 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126001 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126104 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126213 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126258 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126325 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126447 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126564 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126995 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126447 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126573 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126864 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.126975 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.127020 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.127539 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.128004 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.128414 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.128842 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.129302 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z4sqp"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.129640 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.131093 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-px6wl"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.132081 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vjhwt"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.132417 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.132496 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.132818 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.133248 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.134346 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7pqsq"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.143668 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dwsb6"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.144697 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-d2msr"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.145404 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.145563 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hj64t"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.146118 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hj64t" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.146616 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-shxrz"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.146829 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.154426 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.156340 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wcqwh"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.158392 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.158968 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-knpqw"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.161760 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.162222 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.164274 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2t7pp"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.165263 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 06:33:19 crc 
kubenswrapper[4831]: I1203 06:33:19.167612 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.169046 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d7hl4"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.170422 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-22sdc"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.171985 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.175534 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rddpj"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.176424 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.177717 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.179918 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-74p5m"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.180394 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.181149 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kjgd7"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.184996 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.186500 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.194225 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.195159 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.196346 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-k8c4l"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.197368 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.198461 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d2msr"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.199492 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.200519 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.201575 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.203290 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncw4v"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.204804 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.205564 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87j4p"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.207119 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vjhwt"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.208537 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.209908 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z4sqp"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.211353 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-px6wl"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.212926 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.214282 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.216286 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.217717 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.219440 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hj64t"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.221710 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dv9z6"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223051 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dv9z6"] Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223144 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223145 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfeb8ef-4262-4aef-a179-3018896ace13-serving-cert\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223254 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9acc4f76-607c-4694-aa50-cacf7fa07f50-serving-cert\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223283 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-metrics-tls\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223305 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223358 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/9acc4f76-607c-4694-aa50-cacf7fa07f50-trusted-ca\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223379 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-config\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223401 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/15853325-cf95-43bc-a17f-baecad9d4282-images\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223420 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: \"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223443 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-oauth-config\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223468 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-serving-cert\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223489 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-client-ca\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223513 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15853325-cf95-43bc-a17f-baecad9d4282-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223537 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7528353d-35b1-42b9-84b7-14d53336d3ef-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-serving-cert\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jc4qm\" (UniqueName: \"kubernetes.io/projected/c289d28d-642e-4cc4-9d25-f025800585d1-kube-api-access-jc4qm\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223614 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223637 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zptfx\" (UniqueName: \"kubernetes.io/projected/f2985436-5396-4ae0-936a-890d28feee53-kube-api-access-zptfx\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223660 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtpvx\" (UniqueName: \"kubernetes.io/projected/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-kube-api-access-qtpvx\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bxw6\" (UniqueName: 
\"kubernetes.io/projected/2e8e18de-44f2-4319-980f-2c29f0e86336-kube-api-access-6bxw6\") pod \"cluster-samples-operator-665b6dd947-vxszx\" (UID: \"2e8e18de-44f2-4319-980f-2c29f0e86336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223706 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223728 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w55j7\" (UniqueName: \"kubernetes.io/projected/8cfeb8ef-4262-4aef-a179-3018896ace13-kube-api-access-w55j7\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223750 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-profile-collector-cert\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223773 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699c38a7-81dd-4614-8bb9-cd97b5756fc4-serving-cert\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" 
Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223796 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e8e18de-44f2-4319-980f-2c29f0e86336-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vxszx\" (UID: \"2e8e18de-44f2-4319-980f-2c29f0e86336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223821 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-etcd-client\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223845 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223871 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-client-ca\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223900 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-srv-cert\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: 
\"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223926 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66d7f813-acc8-4e09-9575-9c7848a3b062-audit-dir\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.223974 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7245e283-43e6-4dc1-b93a-0e3452b903f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.224002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-config\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.224291 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"audit-dir\" (UniqueName: \"kubernetes.io/host-path/66d7f813-acc8-4e09-9575-9c7848a3b062-audit-dir\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.224673 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-etcd-serving-ca\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.224712 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7528353d-35b1-42b9-84b7-14d53336d3ef-config\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225016 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwlc5\" (UniqueName: \"kubernetes.io/projected/7245e283-43e6-4dc1-b93a-0e3452b903f8-kube-api-access-gwlc5\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225056 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-serving-cert\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 
06:33:19.225082 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-console-config\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225105 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7r5n\" (UniqueName: \"kubernetes.io/projected/51eb384f-9a82-4a1b-ab8d-749a23376b2f-kube-api-access-c7r5n\") pod \"control-plane-machine-set-operator-78cbb6b69f-d8p7m\" (UID: \"51eb384f-9a82-4a1b-ab8d-749a23376b2f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7245e283-43e6-4dc1-b93a-0e3452b903f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225152 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed20032c-5db7-416c-9881-d576c432e4ac-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225181 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225191 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-client-ca\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225211 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-config\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225233 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2985436-5396-4ae0-936a-890d28feee53-audit-dir\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20204b88-623e-47c6-bad0-4d08eba387f7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225264 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-trusted-ca-bundle\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225283 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wr8h\" (UniqueName: \"kubernetes.io/projected/9acc4f76-607c-4694-aa50-cacf7fa07f50-kube-api-access-2wr8h\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225360 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-image-import-ca\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225384 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-etcd-client\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225415 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dt7f\" (UniqueName: \"kubernetes.io/projected/20204b88-623e-47c6-bad0-4d08eba387f7-kube-api-access-8dt7f\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vzbs\" (UniqueName: \"kubernetes.io/projected/49b94e7c-b663-40c2-a89b-ef31f9402ad4-kube-api-access-6vzbs\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225465 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8bs\" (UniqueName: \"kubernetes.io/projected/66d7f813-acc8-4e09-9575-9c7848a3b062-kube-api-access-bb8bs\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7528353d-35b1-42b9-84b7-14d53336d3ef-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225506 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225525 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225562 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nl58p\" (UniqueName: \"kubernetes.io/projected/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-kube-api-access-nl58p\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-encryption-config\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225615 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed20032c-5db7-416c-9881-d576c432e4ac-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-22sdc\" (UID: 
\"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225654 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvkpr\" (UniqueName: \"kubernetes.io/projected/699c38a7-81dd-4614-8bb9-cd97b5756fc4-kube-api-access-cvkpr\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225689 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed20032c-5db7-416c-9881-d576c432e4ac-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225710 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wd7mt\" (UniqueName: \"kubernetes.io/projected/ed20032c-5db7-416c-9881-d576c432e4ac-kube-api-access-wd7mt\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225737 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f2985436-5396-4ae0-936a-890d28feee53-node-pullsecrets\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225753 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-config\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225759 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6m8wf\" (UniqueName: \"kubernetes.io/projected/15853325-cf95-43bc-a17f-baecad9d4282-kube-api-access-6m8wf\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225783 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-config\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225836 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tk5z4\" (UniqueName: \"kubernetes.io/projected/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-kube-api-access-tk5z4\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: \"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225863 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/51eb384f-9a82-4a1b-ab8d-749a23376b2f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d8p7m\" (UID: \"51eb384f-9a82-4a1b-ab8d-749a23376b2f\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225886 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-trusted-ca\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225912 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj249\" (UniqueName: \"kubernetes.io/projected/e528cdde-c61e-42dd-8c55-e5276df017c6-kube-api-access-rj249\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225936 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-images\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225957 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5286\" (UniqueName: \"kubernetes.io/projected/bfba7fc4-12b2-40ff-b18c-170051c75374-kube-api-access-j5286\") pod \"downloads-7954f5f757-2t7pp\" (UID: \"bfba7fc4-12b2-40ff-b18c-170051c75374\") " pod="openshift-console/downloads-7954f5f757-2t7pp" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.225979 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: \"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226000 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20204b88-623e-47c6-bad0-4d08eba387f7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226011 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9acc4f76-607c-4694-aa50-cacf7fa07f50-trusted-ca\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226023 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-encryption-config\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226062 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-audit-policies\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 
06:33:19.226084 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15853325-cf95-43bc-a17f-baecad9d4282-proxy-tls\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226101 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-service-ca\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226118 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9acc4f76-607c-4694-aa50-cacf7fa07f50-config\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226134 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-audit\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226152 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226167 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-trusted-ca-bundle\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226181 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-oauth-serving-cert\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-config\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226201 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226624 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-image-import-ca\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 
06:33:19.226721 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-audit-policies\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.226806 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-etcd-serving-ca\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.224750 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.224779 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/15853325-cf95-43bc-a17f-baecad9d4282-images\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.228921 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-service-ca\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.228952 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/66d7f813-acc8-4e09-9575-9c7848a3b062-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.229857 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-oauth-serving-cert\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.229998 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9acc4f76-607c-4694-aa50-cacf7fa07f50-config\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.230209 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-console-config\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.230225 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-config\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.230375 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-images\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.230491 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f2985436-5396-4ae0-936a-890d28feee53-audit-dir\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.230627 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-audit\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.230663 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-trusted-ca-bundle\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.230751 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f2985436-5396-4ae0-936a-890d28feee53-node-pullsecrets\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.231006 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f2985436-5396-4ae0-936a-890d28feee53-config\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.231138 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-config\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: \"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.232216 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-client-ca\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.232778 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15853325-cf95-43bc-a17f-baecad9d4282-auth-proxy-config\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.233165 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: \"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.233289 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.233301 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-serving-cert\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.234030 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.234180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9acc4f76-607c-4694-aa50-cacf7fa07f50-serving-cert\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.234449 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15853325-cf95-43bc-a17f-baecad9d4282-proxy-tls\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.234515 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfeb8ef-4262-4aef-a179-3018896ace13-serving-cert\") pod 
\"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.234567 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ed20032c-5db7-416c-9881-d576c432e4ac-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.234675 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-etcd-client\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.235053 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-serving-cert\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.235333 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-oauth-config\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.235415 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-serving-cert\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.235774 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-etcd-client\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.236053 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/66d7f813-acc8-4e09-9575-9c7848a3b062-encryption-config\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.236131 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699c38a7-81dd-4614-8bb9-cd97b5756fc4-serving-cert\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.236884 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ed20032c-5db7-416c-9881-d576c432e4ac-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.237028 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/f2985436-5396-4ae0-936a-890d28feee53-encryption-config\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.237092 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.252171 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.264932 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.284746 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.305918 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.325521 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.326623 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/51eb384f-9a82-4a1b-ab8d-749a23376b2f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d8p7m\" (UID: 
\"51eb384f-9a82-4a1b-ab8d-749a23376b2f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.326725 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-trusted-ca\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.326804 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj249\" (UniqueName: \"kubernetes.io/projected/e528cdde-c61e-42dd-8c55-e5276df017c6-kube-api-access-rj249\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.326881 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/20204b88-623e-47c6-bad0-4d08eba387f7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.326959 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.327043 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-metrics-tls\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.327138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.327269 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7528353d-35b1-42b9-84b7-14d53336d3ef-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.327410 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.327548 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qtpvx\" (UniqueName: \"kubernetes.io/projected/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-kube-api-access-qtpvx\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:19 crc kubenswrapper[4831]: 
I1203 06:33:19.327660 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bxw6\" (UniqueName: \"kubernetes.io/projected/2e8e18de-44f2-4319-980f-2c29f0e86336-kube-api-access-6bxw6\") pod \"cluster-samples-operator-665b6dd947-vxszx\" (UID: \"2e8e18de-44f2-4319-980f-2c29f0e86336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.327768 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-profile-collector-cert\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.327873 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e8e18de-44f2-4319-980f-2c29f0e86336-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vxszx\" (UID: \"2e8e18de-44f2-4319-980f-2c29f0e86336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.327970 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-srv-cert\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328074 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328175 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7245e283-43e6-4dc1-b93a-0e3452b903f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328283 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7528353d-35b1-42b9-84b7-14d53336d3ef-config\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328414 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwlc5\" (UniqueName: \"kubernetes.io/projected/7245e283-43e6-4dc1-b93a-0e3452b903f8-kube-api-access-gwlc5\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328504 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7245e283-43e6-4dc1-b93a-0e3452b903f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 
06:33:19.328606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7r5n\" (UniqueName: \"kubernetes.io/projected/51eb384f-9a82-4a1b-ab8d-749a23376b2f-kube-api-access-c7r5n\") pod \"control-plane-machine-set-operator-78cbb6b69f-d8p7m\" (UID: \"51eb384f-9a82-4a1b-ab8d-749a23376b2f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328694 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20204b88-623e-47c6-bad0-4d08eba387f7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328796 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dt7f\" (UniqueName: \"kubernetes.io/projected/20204b88-623e-47c6-bad0-4d08eba387f7-kube-api-access-8dt7f\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328889 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vzbs\" (UniqueName: \"kubernetes.io/projected/49b94e7c-b663-40c2-a89b-ef31f9402ad4-kube-api-access-6vzbs\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.329146 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/7528353d-35b1-42b9-84b7-14d53336d3ef-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.329225 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.329375 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.328693 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/7245e283-43e6-4dc1-b93a-0e3452b903f8-available-featuregates\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.329189 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7528353d-35b1-42b9-84b7-14d53336d3ef-config\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.329666 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/20204b88-623e-47c6-bad0-4d08eba387f7-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.330131 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7528353d-35b1-42b9-84b7-14d53336d3ef-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.330424 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.331293 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.331406 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/20204b88-623e-47c6-bad0-4d08eba387f7-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.332283 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2e8e18de-44f2-4319-980f-2c29f0e86336-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-vxszx\" (UID: \"2e8e18de-44f2-4319-980f-2c29f0e86336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.345087 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7245e283-43e6-4dc1-b93a-0e3452b903f8-serving-cert\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.346221 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.365611 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.385680 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.406960 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.426032 4831 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.446509 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.485764 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.507039 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.526570 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.546035 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.566192 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.587225 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.606132 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.626240 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.646659 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 
06:33:19.665971 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.686236 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.706189 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.743743 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.751217 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.766737 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.787988 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.806904 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.826782 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.856278 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.866902 4831 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.886161 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.906763 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.926855 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.946932 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.967046 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 06:33:19 crc kubenswrapper[4831]: I1203 06:33:19.986502 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.006869 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.026758 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.045999 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.066937 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.085873 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.091547 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-metrics-tls\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.116094 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.119298 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-trusted-ca\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.124656 4831 request.go:700] Waited for 1.000627694s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-operator/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&limit=500&resourceVersion=0 Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.127401 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.145790 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.150981 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/51eb384f-9a82-4a1b-ab8d-749a23376b2f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-d8p7m\" (UID: \"51eb384f-9a82-4a1b-ab8d-749a23376b2f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.165379 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.185825 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.206127 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.225509 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.247091 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.265822 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.285304 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.305826 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.326613 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 06:33:20 crc kubenswrapper[4831]: E1203 06:33:20.327492 4831 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition Dec 03 06:33:20 crc kubenswrapper[4831]: E1203 06:33:20.327617 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume podName:e528cdde-c61e-42dd-8c55-e5276df017c6 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:20.827581263 +0000 UTC m=+138.171164811 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume") pod "collect-profiles-29412390-2zb7w" (UID: "e528cdde-c61e-42dd-8c55-e5276df017c6") : failed to sync configmap cache: timed out waiting for the condition Dec 03 06:33:20 crc kubenswrapper[4831]: E1203 06:33:20.327684 4831 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 06:33:20 crc kubenswrapper[4831]: E1203 06:33:20.327788 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume podName:e528cdde-c61e-42dd-8c55-e5276df017c6 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:20.827764979 +0000 UTC m=+138.171348527 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "secret-volume" (UniqueName: "kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume") pod "collect-profiles-29412390-2zb7w" (UID: "e528cdde-c61e-42dd-8c55-e5276df017c6") : failed to sync secret cache: timed out waiting for the condition Dec 03 06:33:20 crc kubenswrapper[4831]: E1203 06:33:20.327870 4831 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/pprof-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 06:33:20 crc kubenswrapper[4831]: E1203 06:33:20.327949 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-profile-collector-cert podName:49b94e7c-b663-40c2-a89b-ef31f9402ad4 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:20.827933604 +0000 UTC m=+138.171517152 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "profile-collector-cert" (UniqueName: "kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-profile-collector-cert") pod "catalog-operator-68c6474976-fxwnr" (UID: "49b94e7c-b663-40c2-a89b-ef31f9402ad4") : failed to sync secret cache: timed out waiting for the condition Dec 03 06:33:20 crc kubenswrapper[4831]: E1203 06:33:20.328114 4831 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Dec 03 06:33:20 crc kubenswrapper[4831]: E1203 06:33:20.328214 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-srv-cert podName:49b94e7c-b663-40c2-a89b-ef31f9402ad4 nodeName:}" failed. No retries permitted until 2025-12-03 06:33:20.828189473 +0000 UTC m=+138.171773021 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-srv-cert") pod "catalog-operator-68c6474976-fxwnr" (UID: "49b94e7c-b663-40c2-a89b-ef31f9402ad4") : failed to sync secret cache: timed out waiting for the condition Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.345610 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.365791 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.385423 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.405202 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.426494 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.446061 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.466265 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.485893 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.506378 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.527370 
4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.546490 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.566257 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.599137 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.606901 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.625701 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.646528 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.666782 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.687094 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.707274 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.745860 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 
06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.765827 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.785943 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.806470 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.826586 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.842415 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.846419 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.850347 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.850460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-profile-collector-cert\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.850504 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-srv-cert\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.850844 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.852277 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.855896 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.856284 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-srv-cert\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 
06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.857580 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/49b94e7c-b663-40c2-a89b-ef31f9402ad4-profile-collector-cert\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.866409 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.886701 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.906791 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.925763 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.946038 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.966054 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 06:33:20 crc kubenswrapper[4831]: I1203 06:33:20.986793 4831 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.035395 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w55j7\" (UniqueName: 
\"kubernetes.io/projected/8cfeb8ef-4262-4aef-a179-3018896ace13-kube-api-access-w55j7\") pod \"route-controller-manager-6576b87f9c-4jrch\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.048269 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jc4qm\" (UniqueName: \"kubernetes.io/projected/c289d28d-642e-4cc4-9d25-f025800585d1-kube-api-access-jc4qm\") pod \"console-f9d7485db-dwsb6\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.075009 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wr8h\" (UniqueName: \"kubernetes.io/projected/9acc4f76-607c-4694-aa50-cacf7fa07f50-kube-api-access-2wr8h\") pod \"console-operator-58897d9998-shxrz\" (UID: \"9acc4f76-607c-4694-aa50-cacf7fa07f50\") " pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.081717 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nl58p\" (UniqueName: \"kubernetes.io/projected/0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6-kube-api-access-nl58p\") pod \"machine-api-operator-5694c8668f-7pqsq\" (UID: \"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.112585 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8bs\" (UniqueName: \"kubernetes.io/projected/66d7f813-acc8-4e09-9575-9c7848a3b062-kube-api-access-bb8bs\") pod \"apiserver-7bbb656c7d-48l57\" (UID: \"66d7f813-acc8-4e09-9575-9c7848a3b062\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.131933 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zptfx\" (UniqueName: \"kubernetes.io/projected/f2985436-5396-4ae0-936a-890d28feee53-kube-api-access-zptfx\") pod \"apiserver-76f77b778f-wcqwh\" (UID: \"f2985436-5396-4ae0-936a-890d28feee53\") " pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.144586 4831 request.go:700] Waited for 1.915101316s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver-operator/serviceaccounts/openshift-apiserver-operator/token Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.150933 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvkpr\" (UniqueName: \"kubernetes.io/projected/699c38a7-81dd-4614-8bb9-cd97b5756fc4-kube-api-access-cvkpr\") pod \"controller-manager-879f6c89f-22sdc\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.159617 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.173872 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk5z4\" (UniqueName: \"kubernetes.io/projected/98db08ff-cad0-4e36-8b93-7dcb3692a7ef-kube-api-access-tk5z4\") pod \"openshift-apiserver-operator-796bbdcf4f-6pn5x\" (UID: \"98db08ff-cad0-4e36-8b93-7dcb3692a7ef\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.176493 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.194895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5286\" (UniqueName: \"kubernetes.io/projected/bfba7fc4-12b2-40ff-b18c-170051c75374-kube-api-access-j5286\") pod \"downloads-7954f5f757-2t7pp\" (UID: \"bfba7fc4-12b2-40ff-b18c-170051c75374\") " pod="openshift-console/downloads-7954f5f757-2t7pp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.223724 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wd7mt\" (UniqueName: \"kubernetes.io/projected/ed20032c-5db7-416c-9881-d576c432e4ac-kube-api-access-wd7mt\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.228227 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.232894 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ed20032c-5db7-416c-9881-d576c432e4ac-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-rfpf6\" (UID: \"ed20032c-5db7-416c-9881-d576c432e4ac\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.245879 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.267103 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6m8wf\" (UniqueName: \"kubernetes.io/projected/15853325-cf95-43bc-a17f-baecad9d4282-kube-api-access-6m8wf\") pod \"machine-config-operator-74547568cd-pzrgg\" (UID: \"15853325-cf95-43bc-a17f-baecad9d4282\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.269798 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.270307 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-2t7pp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.272619 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.280298 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.281144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj249\" (UniqueName: \"kubernetes.io/projected/e528cdde-c61e-42dd-8c55-e5276df017c6-kube-api-access-rj249\") pod \"collect-profiles-29412390-2zb7w\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.286653 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.290851 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-bound-sa-token\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.309989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtpvx\" (UniqueName: \"kubernetes.io/projected/ce1e278b-fdf9-436e-87cb-01c6bb162a6e-kube-api-access-qtpvx\") pod \"ingress-operator-5b745b69d9-xdfq7\" (UID: \"ce1e278b-fdf9-436e-87cb-01c6bb162a6e\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.338077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bxw6\" (UniqueName: \"kubernetes.io/projected/2e8e18de-44f2-4319-980f-2c29f0e86336-kube-api-access-6bxw6\") pod \"cluster-samples-operator-665b6dd947-vxszx\" (UID: \"2e8e18de-44f2-4319-980f-2c29f0e86336\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.340469 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwlc5\" (UniqueName: \"kubernetes.io/projected/7245e283-43e6-4dc1-b93a-0e3452b903f8-kube-api-access-gwlc5\") pod \"openshift-config-operator-7777fb866f-rddpj\" (UID: \"7245e283-43e6-4dc1-b93a-0e3452b903f8\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.368044 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7r5n\" (UniqueName: 
\"kubernetes.io/projected/51eb384f-9a82-4a1b-ab8d-749a23376b2f-kube-api-access-c7r5n\") pod \"control-plane-machine-set-operator-78cbb6b69f-d8p7m\" (UID: \"51eb384f-9a82-4a1b-ab8d-749a23376b2f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.381653 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.399304 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vzbs\" (UniqueName: \"kubernetes.io/projected/49b94e7c-b663-40c2-a89b-ef31f9402ad4-kube-api-access-6vzbs\") pod \"catalog-operator-68c6474976-fxwnr\" (UID: \"49b94e7c-b663-40c2-a89b-ef31f9402ad4\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.404024 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dt7f\" (UniqueName: \"kubernetes.io/projected/20204b88-623e-47c6-bad0-4d08eba387f7-kube-api-access-8dt7f\") pod \"openshift-controller-manager-operator-756b6f6bc6-d9gxq\" (UID: \"20204b88-623e-47c6-bad0-4d08eba387f7\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.419287 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7pqsq"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.420853 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7528353d-35b1-42b9-84b7-14d53336d3ef-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-jtlgt\" (UID: \"7528353d-35b1-42b9-84b7-14d53336d3ef\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 
06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.434333 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.443202 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-5x48c\" (UID: \"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.469268 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.477212 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.477340 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738e77b8-85e3-468e-8e03-81b14335094e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.477377 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-tls\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc 
kubenswrapper[4831]: I1203 06:33:21.477413 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-trusted-ca\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.477460 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gb4l\" (UniqueName: \"kubernetes.io/projected/ab667483-406a-451a-9695-d9453775a063-kube-api-access-2gb4l\") pod \"service-ca-operator-777779d784-px6wl\" (UID: \"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.477977 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9df54\" (UniqueName: \"kubernetes.io/projected/7af4c3b5-563b-48c2-89d4-2a0975fad647-kube-api-access-9df54\") pod \"migrator-59844c95c7-l2jlm\" (UID: \"7af4c3b5-563b-48c2-89d4-2a0975fad647\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.478438 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.478596 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/738e77b8-85e3-468e-8e03-81b14335094e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479259 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9stz\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-kube-api-access-l9stz\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab667483-406a-451a-9695-d9453775a063-serving-cert\") pod \"service-ca-operator-777779d784-px6wl\" (UID: \"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479436 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-default-certificate\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479468 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479511 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab667483-406a-451a-9695-d9453775a063-config\") pod \"service-ca-operator-777779d784-px6wl\" (UID: \"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479659 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64f831a1-076e-47f9-afba-8812335ee8b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kjgd7\" (UID: \"64f831a1-076e-47f9-afba-8812335ee8b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479717 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479741 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738e77b8-85e3-468e-8e03-81b14335094e-config\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479763 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-drmrm\" (UniqueName: \"kubernetes.io/projected/64f831a1-076e-47f9-afba-8812335ee8b5-kube-api-access-drmrm\") pod \"multus-admission-controller-857f4d67dd-kjgd7\" (UID: \"64f831a1-076e-47f9-afba-8812335ee8b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479837 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-bound-sa-token\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479956 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-certificates\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.479981 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294bf078-98d9-4c39-8fd5-f39926fbfe58-service-ca-bundle\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.480083 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-stats-auth\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " 
pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: E1203 06:33:21.481700 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:21.981683544 +0000 UTC m=+139.325267052 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.513130 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.556811 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581113 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581222 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drgmj\" (UniqueName: \"kubernetes.io/projected/135075ee-2f44-402b-a071-36b3b720d928-kube-api-access-drgmj\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581254 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-stats-auth\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2c111fe-9bfd-4701-9d8f-6077978c9d87-signing-key\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581297 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-plugins-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581715 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738e77b8-85e3-468e-8e03-81b14335094e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581749 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6c22156-9bf2-4e81-9dba-8657b5761a4f-metrics-tls\") pod \"dns-operator-744455d44c-k8c4l\" (UID: \"d6c22156-9bf2-4e81-9dba-8657b5761a4f\") " pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7757c\" (UniqueName: \"kubernetes.io/projected/691274e4-346e-4eed-860a-12513d61bd02-kube-api-access-7757c\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581817 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-mountpoint-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581839 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1358e7a0-8632-4701-9a97-151d31f553cc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581860 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpd2s\" (UniqueName: \"kubernetes.io/projected/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-kube-api-access-vpd2s\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581932 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-policies\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581953 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e25a224-94a7-41ce-ae8c-6f60660873c4-config-volume\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581972 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/738e77b8-85e3-468e-8e03-81b14335094e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.581995 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-machine-approver-tls\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582016 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-etcd-service-ca\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc 
kubenswrapper[4831]: I1203 06:33:21.582222 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/135075ee-2f44-402b-a071-36b3b720d928-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582244 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-etcd-ca\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582262 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582281 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-auth-proxy-config\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582305 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhmwd\" (UniqueName: 
\"kubernetes.io/projected/453efca8-e843-4972-b9ae-eae3df7b02a6-kube-api-access-fhmwd\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582343 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06fb5461-cc65-431e-9236-70177a346997-webhook-cert\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582366 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582418 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6xm2\" (UniqueName: \"kubernetes.io/projected/294bf078-98d9-4c39-8fd5-f39926fbfe58-kube-api-access-t6xm2\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582447 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9lp8s\" (UniqueName: \"kubernetes.io/projected/16ddbafa-f8b0-4595-ad04-495f6c886dc3-kube-api-access-9lp8s\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " 
pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: E1203 06:33:21.582482 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:22.082453928 +0000 UTC m=+139.426037446 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582849 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7222afd6-e87f-4f82-84a7-ce0316d03086-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: \"7222afd6-e87f-4f82-84a7-ce0316d03086\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582964 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hng82\" (UniqueName: \"kubernetes.io/projected/7222afd6-e87f-4f82-84a7-ce0316d03086-kube-api-access-hng82\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: \"7222afd6-e87f-4f82-84a7-ce0316d03086\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.582995 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-registration-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.583025 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64f831a1-076e-47f9-afba-8812335ee8b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kjgd7\" (UID: \"64f831a1-076e-47f9-afba-8812335ee8b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.583046 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.583062 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-csi-data-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.583080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/16ddbafa-f8b0-4595-ad04-495f6c886dc3-certs\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " 
pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584659 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgr7z\" (UniqueName: \"kubernetes.io/projected/d6c22156-9bf2-4e81-9dba-8657b5761a4f-kube-api-access-fgr7z\") pod \"dns-operator-744455d44c-k8c4l\" (UID: \"d6c22156-9bf2-4e81-9dba-8657b5761a4f\") " pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584799 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-bound-sa-token\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584825 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/691274e4-346e-4eed-860a-12513d61bd02-etcd-client\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584849 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrjm7\" (UniqueName: 
\"kubernetes.io/projected/1358e7a0-8632-4701-9a97-151d31f553cc-kube-api-access-qrjm7\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584872 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-config\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584895 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/06fb5461-cc65-431e-9236-70177a346997-tmpfs\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584952 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.584978 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrkps\" (UniqueName: \"kubernetes.io/projected/ae2cef55-64b8-4739-a0d6-4eca7d968107-kube-api-access-zrkps\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585006 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbhrj\" (UniqueName: \"kubernetes.io/projected/199e29b9-0d3f-471b-bf0e-de1576f2654a-kube-api-access-fbhrj\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585057 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-certificates\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294bf078-98d9-4c39-8fd5-f39926fbfe58-service-ca-bundle\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585179 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/135075ee-2f44-402b-a071-36b3b720d928-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585202 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" 
(UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-socket-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585227 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/691274e4-346e-4eed-860a-12513d61bd02-serving-cert\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585248 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585273 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-config\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585326 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-tls\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 
06:33:21.585409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-config\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585444 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585467 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtdx9\" (UniqueName: \"kubernetes.io/projected/a8a0aade-9024-4ece-adbf-c962d36de3bd-kube-api-access-gtdx9\") pod \"ingress-canary-hj64t\" (UID: \"a8a0aade-9024-4ece-adbf-c962d36de3bd\") " pod="openshift-ingress-canary/ingress-canary-hj64t" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585494 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-trusted-ca\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585519 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: 
\"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585540 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8a0aade-9024-4ece-adbf-c962d36de3bd-cert\") pod \"ingress-canary-hj64t\" (UID: \"a8a0aade-9024-4ece-adbf-c962d36de3bd\") " pod="openshift-ingress-canary/ingress-canary-hj64t" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585592 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gb4l\" (UniqueName: \"kubernetes.io/projected/ab667483-406a-451a-9695-d9453775a063-kube-api-access-2gb4l\") pod \"service-ca-operator-777779d784-px6wl\" (UID: \"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585616 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585669 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585810 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae2cef55-64b8-4739-a0d6-4eca7d968107-srv-cert\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585850 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9df54\" (UniqueName: \"kubernetes.io/projected/7af4c3b5-563b-48c2-89d4-2a0975fad647-kube-api-access-9df54\") pod \"migrator-59844c95c7-l2jlm\" (UID: \"7af4c3b5-563b-48c2-89d4-2a0975fad647\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585873 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/16ddbafa-f8b0-4595-ad04-495f6c886dc3-node-bootstrap-token\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585899 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e25a224-94a7-41ce-ae8c-6f60660873c4-metrics-tls\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585941 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vtbb\" (UniqueName: \"kubernetes.io/projected/9c8c9c39-1846-4079-ada6-7a668288ac02-kube-api-access-4vtbb\") pod \"package-server-manager-789f6589d5-b6k5g\" (UID: \"9c8c9c39-1846-4079-ada6-7a668288ac02\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 
06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.585986 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2c111fe-9bfd-4701-9d8f-6077978c9d87-signing-cabundle\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.586010 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l9stz\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-kube-api-access-l9stz\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.586033 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab667483-406a-451a-9695-d9453775a063-serving-cert\") pod \"service-ca-operator-777779d784-px6wl\" (UID: \"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.588623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-trusted-ca\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.588642 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-ca-trust-extracted\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.588981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-default-certificate\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589043 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab667483-406a-451a-9695-d9453775a063-config\") pod \"service-ca-operator-777779d784-px6wl\" (UID: \"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589078 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-service-ca-bundle\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589231 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/738e77b8-85e3-468e-8e03-81b14335094e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589528 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-metrics-certs\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589552 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnldq\" (UniqueName: \"kubernetes.io/projected/8e25a224-94a7-41ce-ae8c-6f60660873c4-kube-api-access-jnldq\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589607 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/294bf078-98d9-4c39-8fd5-f39926fbfe58-service-ca-bundle\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-certificates\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06fb5461-cc65-431e-9236-70177a346997-apiservice-cert\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589882 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.589980 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae2cef55-64b8-4739-a0d6-4eca7d968107-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.590061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/06fb5461-cc65-431e-9236-70177a346997-kube-api-access-tchxf\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.590110 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2pn2\" (UniqueName: \"kubernetes.io/projected/f2c111fe-9bfd-4701-9d8f-6077978c9d87-kube-api-access-r2pn2\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.590164 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab667483-406a-451a-9695-d9453775a063-config\") pod \"service-ca-operator-777779d784-px6wl\" (UID: 
\"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.590479 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-tls\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.590720 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.591308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738e77b8-85e3-468e-8e03-81b14335094e-config\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.591357 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drmrm\" (UniqueName: \"kubernetes.io/projected/64f831a1-076e-47f9-afba-8812335ee8b5-kube-api-access-drmrm\") pod \"multus-admission-controller-857f4d67dd-kjgd7\" (UID: \"64f831a1-076e-47f9-afba-8812335ee8b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.591376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7222afd6-e87f-4f82-84a7-ce0316d03086-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: 
\"7222afd6-e87f-4f82-84a7-ce0316d03086\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.591392 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1358e7a0-8632-4701-9a97-151d31f553cc-proxy-tls\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.591431 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c8c9c39-1846-4079-ada6-7a668288ac02-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b6k5g\" (UID: \"9c8c9c39-1846-4079-ada6-7a668288ac02\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.591459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-dir\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.591502 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc 
kubenswrapper[4831]: I1203 06:33:21.592169 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-installation-pull-secrets\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.592198 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-default-certificate\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.592221 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/738e77b8-85e3-468e-8e03-81b14335094e-config\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.591742 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348c2fe3-0341-4203-adf6-719a6efbcd75-serving-cert\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.592505 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfs6r\" (UniqueName: \"kubernetes.io/projected/348c2fe3-0341-4203-adf6-719a6efbcd75-kube-api-access-gfs6r\") pod 
\"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.594586 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-stats-auth\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.594935 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab667483-406a-451a-9695-d9453775a063-serving-cert\") pod \"service-ca-operator-777779d784-px6wl\" (UID: \"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.595476 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/64f831a1-076e-47f9-afba-8812335ee8b5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-kjgd7\" (UID: \"64f831a1-076e-47f9-afba-8812335ee8b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.598836 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.619138 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.623575 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/738e77b8-85e3-468e-8e03-81b14335094e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-57gps\" (UID: \"738e77b8-85e3-468e-8e03-81b14335094e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.635530 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.639467 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9stz\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-kube-api-access-l9stz\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.667028 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gb4l\" (UniqueName: \"kubernetes.io/projected/ab667483-406a-451a-9695-d9453775a063-kube-api-access-2gb4l\") pod \"service-ca-operator-777779d784-px6wl\" (UID: \"ab667483-406a-451a-9695-d9453775a063\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.675978 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693424 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhmwd\" (UniqueName: \"kubernetes.io/projected/453efca8-e843-4972-b9ae-eae3df7b02a6-kube-api-access-fhmwd\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693474 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06fb5461-cc65-431e-9236-70177a346997-webhook-cert\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693498 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693521 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9lp8s\" (UniqueName: \"kubernetes.io/projected/16ddbafa-f8b0-4595-ad04-495f6c886dc3-kube-api-access-9lp8s\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693547 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693570 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6xm2\" (UniqueName: \"kubernetes.io/projected/294bf078-98d9-4c39-8fd5-f39926fbfe58-kube-api-access-t6xm2\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693592 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7222afd6-e87f-4f82-84a7-ce0316d03086-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: \"7222afd6-e87f-4f82-84a7-ce0316d03086\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693614 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hng82\" (UniqueName: \"kubernetes.io/projected/7222afd6-e87f-4f82-84a7-ce0316d03086-kube-api-access-hng82\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: \"7222afd6-e87f-4f82-84a7-ce0316d03086\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-registration-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: 
\"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693674 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-csi-data-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693693 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/16ddbafa-f8b0-4595-ad04-495f6c886dc3-certs\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693718 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgr7z\" (UniqueName: \"kubernetes.io/projected/d6c22156-9bf2-4e81-9dba-8657b5761a4f-kube-api-access-fgr7z\") pod \"dns-operator-744455d44c-k8c4l\" (UID: \"d6c22156-9bf2-4e81-9dba-8657b5761a4f\") " pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/691274e4-346e-4eed-860a-12513d61bd02-etcd-client\") pod 
\"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693769 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrjm7\" (UniqueName: \"kubernetes.io/projected/1358e7a0-8632-4701-9a97-151d31f553cc-kube-api-access-qrjm7\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693791 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-config\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693810 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/06fb5461-cc65-431e-9236-70177a346997-tmpfs\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693832 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693854 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-zrkps\" (UniqueName: \"kubernetes.io/projected/ae2cef55-64b8-4739-a0d6-4eca7d968107-kube-api-access-zrkps\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693878 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbhrj\" (UniqueName: \"kubernetes.io/projected/199e29b9-0d3f-471b-bf0e-de1576f2654a-kube-api-access-fbhrj\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693914 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/135075ee-2f44-402b-a071-36b3b720d928-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693944 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-socket-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.693968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/691274e4-346e-4eed-860a-12513d61bd02-serving-cert\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: 
I1203 06:33:21.693991 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694024 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-config\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694048 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-config\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694071 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694096 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtdx9\" (UniqueName: \"kubernetes.io/projected/a8a0aade-9024-4ece-adbf-c962d36de3bd-kube-api-access-gtdx9\") pod \"ingress-canary-hj64t\" (UID: 
\"a8a0aade-9024-4ece-adbf-c962d36de3bd\") " pod="openshift-ingress-canary/ingress-canary-hj64t" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694122 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694144 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8a0aade-9024-4ece-adbf-c962d36de3bd-cert\") pod \"ingress-canary-hj64t\" (UID: \"a8a0aade-9024-4ece-adbf-c962d36de3bd\") " pod="openshift-ingress-canary/ingress-canary-hj64t" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694168 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694191 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694214 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/ae2cef55-64b8-4739-a0d6-4eca7d968107-srv-cert\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/16ddbafa-f8b0-4595-ad04-495f6c886dc3-node-bootstrap-token\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694264 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e25a224-94a7-41ce-ae8c-6f60660873c4-metrics-tls\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694286 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vtbb\" (UniqueName: \"kubernetes.io/projected/9c8c9c39-1846-4079-ada6-7a668288ac02-kube-api-access-4vtbb\") pod \"package-server-manager-789f6589d5-b6k5g\" (UID: \"9c8c9c39-1846-4079-ada6-7a668288ac02\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694310 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2c111fe-9bfd-4701-9d8f-6077978c9d87-signing-cabundle\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694359 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-service-ca-bundle\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694380 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-metrics-certs\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694401 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jnldq\" (UniqueName: \"kubernetes.io/projected/8e25a224-94a7-41ce-ae8c-6f60660873c4-kube-api-access-jnldq\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694428 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06fb5461-cc65-431e-9236-70177a346997-apiservice-cert\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694449 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 
06:33:21.694478 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae2cef55-64b8-4739-a0d6-4eca7d968107-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694508 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/06fb5461-cc65-431e-9236-70177a346997-kube-api-access-tchxf\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694530 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2pn2\" (UniqueName: \"kubernetes.io/projected/f2c111fe-9bfd-4701-9d8f-6077978c9d87-kube-api-access-r2pn2\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694551 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c8c9c39-1846-4079-ada6-7a668288ac02-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b6k5g\" (UID: \"9c8c9c39-1846-4079-ada6-7a668288ac02\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694582 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7222afd6-e87f-4f82-84a7-ce0316d03086-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: \"7222afd6-e87f-4f82-84a7-ce0316d03086\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694603 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1358e7a0-8632-4701-9a97-151d31f553cc-proxy-tls\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694626 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694650 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-dir\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694673 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348c2fe3-0341-4203-adf6-719a6efbcd75-serving-cert\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694697 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfs6r\" (UniqueName: \"kubernetes.io/projected/348c2fe3-0341-4203-adf6-719a6efbcd75-kube-api-access-gfs6r\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694715 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-config\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694831 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9df54\" (UniqueName: \"kubernetes.io/projected/7af4c3b5-563b-48c2-89d4-2a0975fad647-kube-api-access-9df54\") pod \"migrator-59844c95c7-l2jlm\" (UID: \"7af4c3b5-563b-48c2-89d4-2a0975fad647\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.695141 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/06fb5461-cc65-431e-9236-70177a346997-tmpfs\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.695332 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7222afd6-e87f-4f82-84a7-ce0316d03086-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: \"7222afd6-e87f-4f82-84a7-ce0316d03086\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.696509 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697068 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/06fb5461-cc65-431e-9236-70177a346997-webhook-cert\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: E1203 06:33:21.697101 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:22.19707923 +0000 UTC m=+139.540662738 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.694724 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drgmj\" (UniqueName: \"kubernetes.io/projected/135075ee-2f44-402b-a071-36b3b720d928-kube-api-access-drgmj\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697236 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2c111fe-9bfd-4701-9d8f-6077978c9d87-signing-key\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697261 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-plugins-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697288 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6c22156-9bf2-4e81-9dba-8657b5761a4f-metrics-tls\") pod \"dns-operator-744455d44c-k8c4l\" (UID: \"d6c22156-9bf2-4e81-9dba-8657b5761a4f\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697330 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7757c\" (UniqueName: \"kubernetes.io/projected/691274e4-346e-4eed-860a-12513d61bd02-kube-api-access-7757c\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697368 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-mountpoint-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697394 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1358e7a0-8632-4701-9a97-151d31f553cc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697420 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697445 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpd2s\" (UniqueName: 
\"kubernetes.io/projected/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-kube-api-access-vpd2s\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697461 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/691274e4-346e-4eed-860a-12513d61bd02-etcd-client\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697470 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-policies\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697506 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e25a224-94a7-41ce-ae8c-6f60660873c4-config-volume\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697528 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-machine-approver-tls\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697546 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-etcd-service-ca\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697563 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/135075ee-2f44-402b-a071-36b3b720d928-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697582 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-etcd-ca\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697599 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.697616 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-auth-proxy-config\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc 
kubenswrapper[4831]: I1203 06:33:21.697983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-policies\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.698047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-mountpoint-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.698135 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-auth-proxy-config\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.699553 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1358e7a0-8632-4701-9a97-151d31f553cc-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.699941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-plugins-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " 
pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.699973 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e25a224-94a7-41ce-ae8c-6f60660873c4-config-volume\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.700230 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/135075ee-2f44-402b-a071-36b3b720d928-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.700300 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-registration-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.700307 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-socket-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.700482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-etcd-ca\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 
03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.701864 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/135075ee-2f44-402b-a071-36b3b720d928-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.702725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/f2c111fe-9bfd-4701-9d8f-6077978c9d87-signing-cabundle\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.704115 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/d6c22156-9bf2-4e81-9dba-8657b5761a4f-metrics-tls\") pod \"dns-operator-744455d44c-k8c4l\" (UID: \"d6c22156-9bf2-4e81-9dba-8657b5761a4f\") " pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.705594 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/16ddbafa-f8b0-4595-ad04-495f6c886dc3-certs\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.705704 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/453efca8-e843-4972-b9ae-eae3df7b02a6-csi-data-dir\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc 
kubenswrapper[4831]: I1203 06:33:21.706108 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.706412 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.706698 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-etcd-service-ca\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.706808 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-config\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.708229 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/9c8c9c39-1846-4079-ada6-7a668288ac02-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-b6k5g\" (UID: \"9c8c9c39-1846-4079-ada6-7a668288ac02\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.708305 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-dir\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.708375 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/691274e4-346e-4eed-860a-12513d61bd02-config\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.708844 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.709511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.710027 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.710511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/f2c111fe-9bfd-4701-9d8f-6077978c9d87-signing-key\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.711623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.711886 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7222afd6-e87f-4f82-84a7-ce0316d03086-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: \"7222afd6-e87f-4f82-84a7-ce0316d03086\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.714143 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-machine-approver-tls\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.714411 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/691274e4-346e-4eed-860a-12513d61bd02-serving-cert\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.714481 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.716038 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e25a224-94a7-41ce-ae8c-6f60660873c4-metrics-tls\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.716153 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/294bf078-98d9-4c39-8fd5-f39926fbfe58-metrics-certs\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.716291 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/348c2fe3-0341-4203-adf6-719a6efbcd75-serving-cert\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.716484 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-bound-sa-token\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.716916 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.718059 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/06fb5461-cc65-431e-9236-70177a346997-apiservice-cert\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.719359 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/ae2cef55-64b8-4739-a0d6-4eca7d968107-profile-collector-cert\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.719585 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.720949 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/16ddbafa-f8b0-4595-ad04-495f6c886dc3-node-bootstrap-token\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.721375 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.721867 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/ae2cef55-64b8-4739-a0d6-4eca7d968107-srv-cert\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.721987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1358e7a0-8632-4701-9a97-151d31f553cc-proxy-tls\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.721876 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: 
\"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.722473 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a8a0aade-9024-4ece-adbf-c962d36de3bd-cert\") pod \"ingress-canary-hj64t\" (UID: \"a8a0aade-9024-4ece-adbf-c962d36de3bd\") " pod="openshift-ingress-canary/ingress-canary-hj64t" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.722531 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.736707 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.762950 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhmwd\" (UniqueName: \"kubernetes.io/projected/453efca8-e843-4972-b9ae-eae3df7b02a6-kube-api-access-fhmwd\") pod \"csi-hostpathplugin-dv9z6\" (UID: \"453efca8-e843-4972-b9ae-eae3df7b02a6\") " pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.765756 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/348c2fe3-0341-4203-adf6-719a6efbcd75-service-ca-bundle\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") 
" pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.765986 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drmrm\" (UniqueName: \"kubernetes.io/projected/64f831a1-076e-47f9-afba-8812335ee8b5-kube-api-access-drmrm\") pod \"multus-admission-controller-857f4d67dd-kjgd7\" (UID: \"64f831a1-076e-47f9-afba-8812335ee8b5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.781917 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.785235 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.785285 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.790134 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6xm2\" (UniqueName: \"kubernetes.io/projected/294bf078-98d9-4c39-8fd5-f39926fbfe58-kube-api-access-t6xm2\") pod \"router-default-5444994796-6n7vj\" (UID: \"294bf078-98d9-4c39-8fd5-f39926fbfe58\") " pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.790405 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" event={"ID":"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6","Type":"ContainerStarted","Data":"821b4cd05e29cf58dc8362d02cd7a5a76671820d5aa75028a5f7bc885a0ecc10"} Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.798264 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:21 crc kubenswrapper[4831]: E1203 06:33:21.798891 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:22.298872536 +0000 UTC m=+139.642456034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.806360 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9lp8s\" (UniqueName: \"kubernetes.io/projected/16ddbafa-f8b0-4595-ad04-495f6c886dc3-kube-api-access-9lp8s\") pod \"machine-config-server-74p5m\" (UID: \"16ddbafa-f8b0-4595-ad04-495f6c886dc3\") " pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.831511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drgmj\" (UniqueName: \"kubernetes.io/projected/135075ee-2f44-402b-a071-36b3b720d928-kube-api-access-drgmj\") pod \"marketplace-operator-79b997595-vjhwt\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.837506 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.846065 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.852301 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-2t7pp"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.854218 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgr7z\" (UniqueName: \"kubernetes.io/projected/d6c22156-9bf2-4e81-9dba-8657b5761a4f-kube-api-access-fgr7z\") pod \"dns-operator-744455d44c-k8c4l\" (UID: \"d6c22156-9bf2-4e81-9dba-8657b5761a4f\") " pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.860307 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.860383 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-shxrz"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.864195 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.869720 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-dwsb6"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.877346 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x"] Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.877942 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-74p5m" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.892567 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrkps\" (UniqueName: \"kubernetes.io/projected/ae2cef55-64b8-4739-a0d6-4eca7d968107-kube-api-access-zrkps\") pod \"olm-operator-6b444d44fb-fv8cx\" (UID: \"ae2cef55-64b8-4739-a0d6-4eca7d968107\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.892875 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.900021 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:21 crc kubenswrapper[4831]: E1203 06:33:21.900570 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:22.40055562 +0000 UTC m=+139.744139138 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.909880 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vtbb\" (UniqueName: \"kubernetes.io/projected/9c8c9c39-1846-4079-ada6-7a668288ac02-kube-api-access-4vtbb\") pod \"package-server-manager-789f6589d5-b6k5g\" (UID: \"9c8c9c39-1846-4079-ada6-7a668288ac02\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.916686 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrjm7\" (UniqueName: \"kubernetes.io/projected/1358e7a0-8632-4701-9a97-151d31f553cc-kube-api-access-qrjm7\") pod \"machine-config-controller-84d6567774-6w69b\" (UID: \"1358e7a0-8632-4701-9a97-151d31f553cc\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.924966 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbhrj\" (UniqueName: \"kubernetes.io/projected/199e29b9-0d3f-471b-bf0e-de1576f2654a-kube-api-access-fbhrj\") pod \"oauth-openshift-558db77b4-87j4p\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.947261 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7757c\" (UniqueName: 
\"kubernetes.io/projected/691274e4-346e-4eed-860a-12513d61bd02-kube-api-access-7757c\") pod \"etcd-operator-b45778765-knpqw\" (UID: \"691274e4-346e-4eed-860a-12513d61bd02\") " pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.976679 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:21 crc kubenswrapper[4831]: I1203 06:33:21.985257 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2pn2\" (UniqueName: \"kubernetes.io/projected/f2c111fe-9bfd-4701-9d8f-6077978c9d87-kube-api-access-r2pn2\") pod \"service-ca-9c57cc56f-z4sqp\" (UID: \"f2c111fe-9bfd-4701-9d8f-6077978c9d87\") " pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.002241 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.002734 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:22.50271482 +0000 UTC m=+139.846298328 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.018620 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.028338 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.029900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hng82\" (UniqueName: \"kubernetes.io/projected/7222afd6-e87f-4f82-84a7-ce0316d03086-kube-api-access-hng82\") pod \"kube-storage-version-migrator-operator-b67b599dd-b7ghk\" (UID: \"7222afd6-e87f-4f82-84a7-ce0316d03086\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.034343 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tchxf\" (UniqueName: \"kubernetes.io/projected/06fb5461-cc65-431e-9236-70177a346997-kube-api-access-tchxf\") pod \"packageserver-d55dfcdfc-jtcpn\" (UID: \"06fb5461-cc65-431e-9236-70177a346997\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.049271 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnldq\" (UniqueName: 
\"kubernetes.io/projected/8e25a224-94a7-41ce-ae8c-6f60660873c4-kube-api-access-jnldq\") pod \"dns-default-d2msr\" (UID: \"8e25a224-94a7-41ce-ae8c-6f60660873c4\") " pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.059710 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpd2s\" (UniqueName: \"kubernetes.io/projected/19cb3565-2001-4348-bb7f-79e9b1ce8aa8-kube-api-access-vpd2s\") pod \"machine-approver-56656f9798-sb6sm\" (UID: \"19cb3565-2001-4348-bb7f-79e9b1ce8aa8\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.061001 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.062190 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.086440 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtdx9\" (UniqueName: \"kubernetes.io/projected/a8a0aade-9024-4ece-adbf-c962d36de3bd-kube-api-access-gtdx9\") pod \"ingress-canary-hj64t\" (UID: \"a8a0aade-9024-4ece-adbf-c962d36de3bd\") " pod="openshift-ingress-canary/ingress-canary-hj64t" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.087768 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.091801 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.094477 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.101274 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfs6r\" (UniqueName: \"kubernetes.io/projected/348c2fe3-0341-4203-adf6-719a6efbcd75-kube-api-access-gfs6r\") pod \"authentication-operator-69f744f599-d7hl4\" (UID: \"348c2fe3-0341-4203-adf6-719a6efbcd75\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.103597 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.104035 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:22.604020171 +0000 UTC m=+139.947603679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.108960 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-wcqwh"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.109023 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.110298 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.122479 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.130616 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.163217 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.170460 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hj64t" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.204280 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.204643 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:22.70462489 +0000 UTC m=+140.048208398 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.209047 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.216777 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.284723 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.295338 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-rddpj"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.296253 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.297529 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.305902 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.306256 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:22.806238861 +0000 UTC m=+140.149822369 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.327951 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.335710 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.341690 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-22sdc"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.373593 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-px6wl"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.406642 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.407940 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:33:22.907924545 +0000 UTC m=+140.251508043 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.408802 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode528cdde_c61e_42dd_8c55_e5276df017c6.slice/crio-0b2a6b11a65bd210057cb4a722cdfa0f3f4bc963c3be48c414a9fe4d2058aaf8 WatchSource:0}: Error finding container 0b2a6b11a65bd210057cb4a722cdfa0f3f4bc963c3be48c414a9fe4d2058aaf8: Status 404 returned error can't find the container with id 0b2a6b11a65bd210057cb4a722cdfa0f3f4bc963c3be48c414a9fe4d2058aaf8 Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.409175 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51eb384f_9a82_4a1b_ab8d_749a23376b2f.slice/crio-0223e281f4e2fc30f642444c39054ced139233e8de7462433f0689f414073c5e WatchSource:0}: Error finding container 0223e281f4e2fc30f642444c39054ced139233e8de7462433f0689f414073c5e: Status 404 returned error can't find the container with id 0223e281f4e2fc30f642444c39054ced139233e8de7462433f0689f414073c5e Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.409965 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2985436_5396_4ae0_936a_890d28feee53.slice/crio-a43bc4a159346641d6a693ebbfe45da3a2baff808010e8e646d591d6099faecc WatchSource:0}: Error finding container 
a43bc4a159346641d6a693ebbfe45da3a2baff808010e8e646d591d6099faecc: Status 404 returned error can't find the container with id a43bc4a159346641d6a693ebbfe45da3a2baff808010e8e646d591d6099faecc Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.432055 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-k8c4l"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.444997 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.451197 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vjhwt"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.468734 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.479194 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx"] Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.488069 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49b94e7c_b663_40c2_a89b_ef31f9402ad4.slice/crio-84f83655d05f579a677ee660a51e3c9b58e9ae1bc73707a1f7ecb0b4144dce1b WatchSource:0}: Error finding container 84f83655d05f579a677ee660a51e3c9b58e9ae1bc73707a1f7ecb0b4144dce1b: Status 404 returned error can't find the container with id 84f83655d05f579a677ee660a51e3c9b58e9ae1bc73707a1f7ecb0b4144dce1b Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.489519 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20204b88_623e_47c6_bad0_4d08eba387f7.slice/crio-7d37fc049da9fe2fbe690a39bf627324170e66c674826543e7b4a89dfaabb32c WatchSource:0}: Error finding container 7d37fc049da9fe2fbe690a39bf627324170e66c674826543e7b4a89dfaabb32c: Status 404 returned error can't find the container with id 7d37fc049da9fe2fbe690a39bf627324170e66c674826543e7b4a89dfaabb32c Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.493079 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod699c38a7_81dd_4614_8bb9_cd97b5756fc4.slice/crio-ae95805608391884a4f8608836752fca345ab3acf2daa14ef0ff7f4d0bc2fa45 WatchSource:0}: Error finding container ae95805608391884a4f8608836752fca345ab3acf2daa14ef0ff7f4d0bc2fa45: Status 404 returned error can't find the container with id ae95805608391884a4f8608836752fca345ab3acf2daa14ef0ff7f4d0bc2fa45 Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.495015 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7245e283_43e6_4dc1_b93a_0e3452b903f8.slice/crio-6bf69ac319809d81110131157a352b484bb363470d0640cb0062a8f735a91090 WatchSource:0}: Error finding container 6bf69ac319809d81110131157a352b484bb363470d0640cb0062a8f735a91090: Status 404 returned error can't find the container with id 6bf69ac319809d81110131157a352b484bb363470d0640cb0062a8f735a91090 Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.508225 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 
06:33:22.508571 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.008560095 +0000 UTC m=+140.352143603 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.526585 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-dv9z6"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.527735 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-kjgd7"] Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.533456 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6c22156_9bf2_4e81_9dba_8657b5761a4f.slice/crio-a03b607362b8058e763fec6c3b260044b2c4c57ae5622ad49b122e3a22ab6f84 WatchSource:0}: Error finding container a03b607362b8058e763fec6c3b260044b2c4c57ae5622ad49b122e3a22ab6f84: Status 404 returned error can't find the container with id a03b607362b8058e763fec6c3b260044b2c4c57ae5622ad49b122e3a22ab6f84 Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.593096 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod453efca8_e843_4972_b9ae_eae3df7b02a6.slice/crio-e16c7bbe3ea489afce260084b7702cec50d61f162301d7b3213aea1b242836df WatchSource:0}: Error 
finding container e16c7bbe3ea489afce260084b7702cec50d61f162301d7b3213aea1b242836df: Status 404 returned error can't find the container with id e16c7bbe3ea489afce260084b7702cec50d61f162301d7b3213aea1b242836df Dec 03 06:33:22 crc kubenswrapper[4831]: W1203 06:33:22.601043 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64f831a1_076e_47f9_afba_8812335ee8b5.slice/crio-f08ed83969c2756fbf24d089f326fd33b50a641772df00ce8d54a6cb2fb21c7f WatchSource:0}: Error finding container f08ed83969c2756fbf24d089f326fd33b50a641772df00ce8d54a6cb2fb21c7f: Status 404 returned error can't find the container with id f08ed83969c2756fbf24d089f326fd33b50a641772df00ce8d54a6cb2fb21c7f Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.609009 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.609206 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.109182855 +0000 UTC m=+140.452766363 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.609278 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.609629 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.109616419 +0000 UTC m=+140.453199927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.649693 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-knpqw"] Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.710248 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.710418 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.210391413 +0000 UTC m=+140.553974921 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.710964 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.711279 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.211271162 +0000 UTC m=+140.554854670 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.812930 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.813524 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.313504053 +0000 UTC m=+140.657087561 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.814014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" event={"ID":"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc","Type":"ContainerStarted","Data":"096c3e418ec450a87f136af8156c20071df954174eb7a9fa98e76141311698f9"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.816957 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" event={"ID":"453efca8-e843-4972-b9ae-eae3df7b02a6","Type":"ContainerStarted","Data":"e16c7bbe3ea489afce260084b7702cec50d61f162301d7b3213aea1b242836df"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.818223 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2t7pp" event={"ID":"bfba7fc4-12b2-40ff-b18c-170051c75374","Type":"ContainerStarted","Data":"8a2d77a3e5f038d1dfca75d8b6cd7ea3b6c861f168db0485a366c11271b2fc0a"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.819226 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6n7vj" event={"ID":"294bf078-98d9-4c39-8fd5-f39926fbfe58","Type":"ContainerStarted","Data":"11d7f689811eff1e7f462d40eee022d51420012297597e14cfabcfc0c1eb5a02"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.819984 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" 
event={"ID":"691274e4-346e-4eed-860a-12513d61bd02","Type":"ContainerStarted","Data":"167755e7671509aaa99e04eadcb3c6895ea274f1ea52fb10a328a6021bdf3de8"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.820552 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" event={"ID":"66d7f813-acc8-4e09-9575-9c7848a3b062","Type":"ContainerStarted","Data":"0cbf668432853aa15e220a9884f914481140e554f3073dffd47ead734b1f123c"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.821194 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" event={"ID":"7af4c3b5-563b-48c2-89d4-2a0975fad647","Type":"ContainerStarted","Data":"ed310be3fadf28a84a058a1ff9b6496114746978f5c2034f494b631500c18a68"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.821851 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-74p5m" event={"ID":"16ddbafa-f8b0-4595-ad04-495f6c886dc3","Type":"ContainerStarted","Data":"0bbf3153851bf1b624a571aa5662a50a3d357d57a6df6e4472eef22f82c80e23"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.822530 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" event={"ID":"64f831a1-076e-47f9-afba-8812335ee8b5","Type":"ContainerStarted","Data":"f08ed83969c2756fbf24d089f326fd33b50a641772df00ce8d54a6cb2fb21c7f"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.823270 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" event={"ID":"e528cdde-c61e-42dd-8c55-e5276df017c6","Type":"ContainerStarted","Data":"0b2a6b11a65bd210057cb4a722cdfa0f3f4bc963c3be48c414a9fe4d2058aaf8"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.824394 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" event={"ID":"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6","Type":"ContainerStarted","Data":"4d0252eae95e9c043cee7868a22dd3ea2c0ba97a4f8a0e606f6413b0241e7d4f"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.825087 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" event={"ID":"699c38a7-81dd-4614-8bb9-cd97b5756fc4","Type":"ContainerStarted","Data":"ae95805608391884a4f8608836752fca345ab3acf2daa14ef0ff7f4d0bc2fa45"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.826328 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" event={"ID":"51eb384f-9a82-4a1b-ab8d-749a23376b2f","Type":"ContainerStarted","Data":"0223e281f4e2fc30f642444c39054ced139233e8de7462433f0689f414073c5e"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.827779 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" event={"ID":"ce1e278b-fdf9-436e-87cb-01c6bb162a6e","Type":"ContainerStarted","Data":"cbd93756300f86760f98f18fa5721535defc89579b331620d6174d334e9e9337"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.828487 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" event={"ID":"7528353d-35b1-42b9-84b7-14d53336d3ef","Type":"ContainerStarted","Data":"38fcf7c84bce1f04a5dcaa046cf407984bc37e8e8c4e2eaae3c9789e02bc3132"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.829106 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" event={"ID":"d6c22156-9bf2-4e81-9dba-8657b5761a4f","Type":"ContainerStarted","Data":"a03b607362b8058e763fec6c3b260044b2c4c57ae5622ad49b122e3a22ab6f84"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.829942 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" event={"ID":"49b94e7c-b663-40c2-a89b-ef31f9402ad4","Type":"ContainerStarted","Data":"84f83655d05f579a677ee660a51e3c9b58e9ae1bc73707a1f7ecb0b4144dce1b"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.831593 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" event={"ID":"98db08ff-cad0-4e36-8b93-7dcb3692a7ef","Type":"ContainerStarted","Data":"c886e388179ab71ddfbc3f507e4536c4821e07ab4ddc8621d802e2cc331ccb08"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.833610 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" event={"ID":"15853325-cf95-43bc-a17f-baecad9d4282","Type":"ContainerStarted","Data":"ad5faa9ed613d6acd5119411aef871a7966288c2ed07572a01d16b13666598c1"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.833637 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" event={"ID":"15853325-cf95-43bc-a17f-baecad9d4282","Type":"ContainerStarted","Data":"a8c9faccb0bc6a3a498b2d974060dfbd38931fbc603136dd26e07cf0739b0b17"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.838470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" event={"ID":"ab667483-406a-451a-9695-d9453775a063","Type":"ContainerStarted","Data":"b19259ce2976bd1d0f47e27a671b5bf3444700aef6eb82187426b499f2c11142"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.847801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dwsb6" event={"ID":"c289d28d-642e-4cc4-9d25-f025800585d1","Type":"ContainerStarted","Data":"40d62e3900bad6dab75bc85f9d1a23c515aea109667b2a9eb825f19b5a35c0c4"} Dec 03 
06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.856112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" event={"ID":"738e77b8-85e3-468e-8e03-81b14335094e","Type":"ContainerStarted","Data":"64fc85496bd93b48f6d0ea4e162bfe5f98e57f094048d01fde2e7a531fa5a73d"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.873786 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" event={"ID":"7245e283-43e6-4dc1-b93a-0e3452b903f8","Type":"ContainerStarted","Data":"6bf69ac319809d81110131157a352b484bb363470d0640cb0062a8f735a91090"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.878272 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" event={"ID":"ed20032c-5db7-416c-9881-d576c432e4ac","Type":"ContainerStarted","Data":"656e87f96f4f0ec368afc8ce8e2c5ac4af6e13d3eaf8092cbc912d04db37e6fc"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.879792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" event={"ID":"8cfeb8ef-4262-4aef-a179-3018896ace13","Type":"ContainerStarted","Data":"ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.879822 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" event={"ID":"8cfeb8ef-4262-4aef-a179-3018896ace13","Type":"ContainerStarted","Data":"c8b154f64d0cb5d1e9286530da0327d0886f1a3e1ec8d9b3c3422cb1c5653f64"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.893178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" 
event={"ID":"135075ee-2f44-402b-a071-36b3b720d928","Type":"ContainerStarted","Data":"908ecc83c9385b6fe4b439ba14eb36002a432f67121564e193e7212a63b7eb14"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.897006 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" event={"ID":"f2985436-5396-4ae0-936a-890d28feee53","Type":"ContainerStarted","Data":"a43bc4a159346641d6a693ebbfe45da3a2baff808010e8e646d591d6099faecc"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.914203 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-shxrz" event={"ID":"9acc4f76-607c-4694-aa50-cacf7fa07f50","Type":"ContainerStarted","Data":"cb39f2291992837bc7bd8303e1a9f253faf95585eb4ff6a8899ac6f7ede38ab8"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.914240 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-shxrz" event={"ID":"9acc4f76-607c-4694-aa50-cacf7fa07f50","Type":"ContainerStarted","Data":"6a3e9d6d156df91c470ed27cb4482c3167bbe50fe5e2b1a0409c92fc862ef969"} Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.914367 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:22 crc kubenswrapper[4831]: E1203 06:33:22.914690 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.41467813 +0000 UTC m=+140.758261638 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:22 crc kubenswrapper[4831]: I1203 06:33:22.921561 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" event={"ID":"20204b88-623e-47c6-bad0-4d08eba387f7","Type":"ContainerStarted","Data":"7d37fc049da9fe2fbe690a39bf627324170e66c674826543e7b4a89dfaabb32c"} Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.016151 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.016305 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.516285152 +0000 UTC m=+140.859868660 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.016859 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.017220 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.517208701 +0000 UTC m=+140.860792199 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.119794 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.120540 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.620524097 +0000 UTC m=+140.964107605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.173075 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g"] Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.175821 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx"] Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.179338 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hj64t"] Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.223211 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.225888 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.72587215 +0000 UTC m=+141.069455658 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.244120 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b"] Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.291272 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-d2msr"] Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.294004 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk"] Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.326197 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.326587 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.826573322 +0000 UTC m=+141.170156830 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.327113 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87j4p"] Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.335005 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-z4sqp"] Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.362484 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn"] Dec 03 06:33:23 crc kubenswrapper[4831]: W1203 06:33:23.411223 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8a0aade_9024_4ece_adbf_c962d36de3bd.slice/crio-b7077a2e068ae38bbb0e0ebb208b8db749b4de5ab48a7f41b22636c9bb3b86bd WatchSource:0}: Error finding container b7077a2e068ae38bbb0e0ebb208b8db749b4de5ab48a7f41b22636c9bb3b86bd: Status 404 returned error can't find the container with id b7077a2e068ae38bbb0e0ebb208b8db749b4de5ab48a7f41b22636c9bb3b86bd Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.427659 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 
06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.428066 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:23.928053229 +0000 UTC m=+141.271636737 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.528975 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.529389 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.029374391 +0000 UTC m=+141.372957899 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:23 crc kubenswrapper[4831]: W1203 06:33:23.604802 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e25a224_94a7_41ce_ae8c_6f60660873c4.slice/crio-23f97c14fadacfbd05a81f324cde76f0ba39088adf2eb4939f77cd18dd71d6a6 WatchSource:0}: Error finding container 23f97c14fadacfbd05a81f324cde76f0ba39088adf2eb4939f77cd18dd71d6a6: Status 404 returned error can't find the container with id 23f97c14fadacfbd05a81f324cde76f0ba39088adf2eb4939f77cd18dd71d6a6
Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.631328 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v"
Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.631652 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.131640504 +0000 UTC m=+141.475224012 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:23 crc kubenswrapper[4831]: W1203 06:33:23.727725 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19cb3565_2001_4348_bb7f_79e9b1ce8aa8.slice/crio-4d088ee2ce9b405827a0ba5509ef0c340020237a9d0e04bb8489992196b354db WatchSource:0}: Error finding container 4d088ee2ce9b405827a0ba5509ef0c340020237a9d0e04bb8489992196b354db: Status 404 returned error can't find the container with id 4d088ee2ce9b405827a0ba5509ef0c340020237a9d0e04bb8489992196b354db
Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.732707 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.733143 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.233128161 +0000 UTC m=+141.576711669 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.834829 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v"
Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.836157 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.336128167 +0000 UTC m=+141.679711675 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.864725 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-d7hl4"]
Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.936513 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 06:33:23 crc kubenswrapper[4831]: E1203 06:33:23.936976 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.436956903 +0000 UTC m=+141.780540411 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:23 crc kubenswrapper[4831]: I1203 06:33:23.964705 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dwsb6" event={"ID":"c289d28d-642e-4cc4-9d25-f025800585d1","Type":"ContainerStarted","Data":"0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.006687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" event={"ID":"20204b88-623e-47c6-bad0-4d08eba387f7","Type":"ContainerStarted","Data":"0687e5fd95d2ae7d431b4dd08ad8c136fc1a0b4bf5fa27efb25639a8a41e0d6d"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.026668 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2msr" event={"ID":"8e25a224-94a7-41ce-ae8c-6f60660873c4","Type":"ContainerStarted","Data":"23f97c14fadacfbd05a81f324cde76f0ba39088adf2eb4939f77cd18dd71d6a6"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.030041 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" event={"ID":"49b94e7c-b663-40c2-a89b-ef31f9402ad4","Type":"ContainerStarted","Data":"fcdf95f9afa01b516d006d614593b8f2589b13e6e4df160588f0ca02a2ec2f58"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.034140 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.038755 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v"
Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.062508 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.562489747 +0000 UTC m=+141.906073255 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.080223 4831 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-fxwnr container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused" start-of-body=
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.080499 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" podUID="49b94e7c-b663-40c2-a89b-ef31f9402ad4" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.40:8443/healthz\": dial tcp 10.217.0.40:8443: connect: connection refused"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.134243 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-dwsb6" podStartSLOduration=121.134213143 podStartE2EDuration="2m1.134213143s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.095941557 +0000 UTC m=+141.439525065" watchObservedRunningTime="2025-12-03 06:33:24.134213143 +0000 UTC m=+141.477796641"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.135952 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" event={"ID":"2e8e18de-44f2-4319-980f-2c29f0e86336","Type":"ContainerStarted","Data":"b7240ea1b931bed462f99dfcd1cf29294b9bc0199cbdfabf675b25f2cacdc58c"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.146237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" event={"ID":"f2c111fe-9bfd-4701-9d8f-6077978c9d87","Type":"ContainerStarted","Data":"09f2c4c0f561d127d043cf584f363531bcba8089c2115d41de16c6b7f288b430"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.149715 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" podStartSLOduration=121.149691893 podStartE2EDuration="2m1.149691893s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.128134557 +0000 UTC m=+141.471718065" watchObservedRunningTime="2025-12-03 06:33:24.149691893 +0000 UTC m=+141.493275421"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.159497 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.161024 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.661002199 +0000 UTC m=+142.004585707 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.161631 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-d9gxq" podStartSLOduration=121.16103506 podStartE2EDuration="2m1.16103506s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.158058754 +0000 UTC m=+141.501642282" watchObservedRunningTime="2025-12-03 06:33:24.16103506 +0000 UTC m=+141.504618568"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.181854 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" event={"ID":"ed20032c-5db7-416c-9881-d576c432e4ac","Type":"ContainerStarted","Data":"8bf0493ad07b2bdcdb12bc25b30436c0bcdad865a0c65be68050648407768014"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.189796 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" event={"ID":"9c8c9c39-1846-4079-ada6-7a668288ac02","Type":"ContainerStarted","Data":"5256df8b1410f8371b4bb1b74e15dbc6f0af836a3cc325c129030ad51944ac02"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.196910 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" event={"ID":"7af4c3b5-563b-48c2-89d4-2a0975fad647","Type":"ContainerStarted","Data":"1f38094d6b336d0cfe7e6bdf65ad76c827e18f80c9705d65d0a2c1232c26c129"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.201226 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" event={"ID":"ae2cef55-64b8-4739-a0d6-4eca7d968107","Type":"ContainerStarted","Data":"359a5a5096f9a3018dfea6a517bfb1154c6dff7b643858985163e6f1f9ec7491"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.206223 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" event={"ID":"06fb5461-cc65-431e-9236-70177a346997","Type":"ContainerStarted","Data":"83f5e9d63dee2f7f78f61ed43b5cc33c227fa1c0f45e563691eed73f197a8800"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.210643 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-rfpf6" podStartSLOduration=121.210628781 podStartE2EDuration="2m1.210628781s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.206134916 +0000 UTC m=+141.549718414" watchObservedRunningTime="2025-12-03 06:33:24.210628781 +0000 UTC m=+141.554212289"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.216352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" event={"ID":"ce1e278b-fdf9-436e-87cb-01c6bb162a6e","Type":"ContainerStarted","Data":"89788d36bc66fcac0a7bdb7d8c7bff3e8d701af5e4b2d2659b8d0ed8482ea003"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.245809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" event={"ID":"7222afd6-e87f-4f82-84a7-ce0316d03086","Type":"ContainerStarted","Data":"672e59627c678ace41a00e1bad50a592d85c56e9831cd9df06eaa5b1ee1b4308"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.254458 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-2t7pp" event={"ID":"bfba7fc4-12b2-40ff-b18c-170051c75374","Type":"ContainerStarted","Data":"18babdffeb1d52dbad5deefd3de20376e19249e248f9f8d8d20f0e0167b21f5a"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.255429 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-2t7pp"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.259541 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2t7pp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body=
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.259573 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2t7pp" podUID="bfba7fc4-12b2-40ff-b18c-170051c75374" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.260568 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v"
Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.260899 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.760886254 +0000 UTC m=+142.104469762 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.261025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-6n7vj" event={"ID":"294bf078-98d9-4c39-8fd5-f39926fbfe58","Type":"ContainerStarted","Data":"4212020705a2d10b3c892e02cec3098418efcdda5ce2a1a08c10e1745cca8501"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.291247 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" event={"ID":"699c38a7-81dd-4614-8bb9-cd97b5756fc4","Type":"ContainerStarted","Data":"21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.291468 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.295164 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" event={"ID":"51eb384f-9a82-4a1b-ab8d-749a23376b2f","Type":"ContainerStarted","Data":"780bc73f4344f2150b7ab3b6b80a0c2923358566a1261172cb6958c30d87600a"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.296818 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" event={"ID":"19cb3565-2001-4348-bb7f-79e9b1ce8aa8","Type":"ContainerStarted","Data":"4d088ee2ce9b405827a0ba5509ef0c340020237a9d0e04bb8489992196b354db"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.298022 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" event={"ID":"199e29b9-0d3f-471b-bf0e-de1576f2654a","Type":"ContainerStarted","Data":"9a6b64a615bdc7044793c96bb824059e3b58b932fe04ad76c41393ae04613031"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.299597 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-74p5m" event={"ID":"16ddbafa-f8b0-4595-ad04-495f6c886dc3","Type":"ContainerStarted","Data":"021ed72d95abaf8fbe467b1f1d018bc5cc6286e1e457649a304acd0fbbe42a72"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.302682 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.313841 4831 generic.go:334] "Generic (PLEG): container finished" podID="7245e283-43e6-4dc1-b93a-0e3452b903f8" containerID="e5fd4f81001ce045e8dbd4e7ff391d3926ff15d7d680bd9909319112575e20a4" exitCode=0
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.313954 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" event={"ID":"7245e283-43e6-4dc1-b93a-0e3452b903f8","Type":"ContainerDied","Data":"e5fd4f81001ce045e8dbd4e7ff391d3926ff15d7d680bd9909319112575e20a4"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.314589 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-2t7pp" podStartSLOduration=121.314568418 podStartE2EDuration="2m1.314568418s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.308722759 +0000 UTC m=+141.652306267" watchObservedRunningTime="2025-12-03 06:33:24.314568418 +0000 UTC m=+141.658151926"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.354342 4831 generic.go:334] "Generic (PLEG): container finished" podID="66d7f813-acc8-4e09-9575-9c7848a3b062" containerID="bead38e1a687000d954758c9f62fe6e7c1f702cbeb987e0ded5085afa1cb5833" exitCode=0
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.354430 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" event={"ID":"66d7f813-acc8-4e09-9575-9c7848a3b062","Type":"ContainerDied","Data":"bead38e1a687000d954758c9f62fe6e7c1f702cbeb987e0ded5085afa1cb5833"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.361164 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.362543 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.862523947 +0000 UTC m=+142.206107475 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.365699 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" event={"ID":"e528cdde-c61e-42dd-8c55-e5276df017c6","Type":"ContainerStarted","Data":"a7f810c2729b06550a79448fe19586d7a79f83aa112785034d92eece47e37f1f"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.395392 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" podStartSLOduration=121.395376927 podStartE2EDuration="2m1.395376927s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.365138921 +0000 UTC m=+141.708722429" watchObservedRunningTime="2025-12-03 06:33:24.395376927 +0000 UTC m=+141.738960435"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.395593 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-74p5m" podStartSLOduration=5.395588984 podStartE2EDuration="5.395588984s" podCreationTimestamp="2025-12-03 06:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.393199437 +0000 UTC m=+141.736782945" watchObservedRunningTime="2025-12-03 06:33:24.395588984 +0000 UTC m=+141.739172492"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.423544 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hj64t" event={"ID":"a8a0aade-9024-4ece-adbf-c962d36de3bd","Type":"ContainerStarted","Data":"b7077a2e068ae38bbb0e0ebb208b8db749b4de5ab48a7f41b22636c9bb3b86bd"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.446424 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-6n7vj" podStartSLOduration=121.446404605 podStartE2EDuration="2m1.446404605s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.444542415 +0000 UTC m=+141.788125923" watchObservedRunningTime="2025-12-03 06:33:24.446404605 +0000 UTC m=+141.789988113"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.463094 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v"
Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.464032 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:24.964015483 +0000 UTC m=+142.307598991 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.468994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" event={"ID":"98db08ff-cad0-4e36-8b93-7dcb3692a7ef","Type":"ContainerStarted","Data":"944abef445dda579f26f22cdb78bb41ed0e18dbda4ef575b877e635fdb5b9634"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.469415 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-d8p7m" podStartSLOduration=121.469401878 podStartE2EDuration="2m1.469401878s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.466884356 +0000 UTC m=+141.810467864" watchObservedRunningTime="2025-12-03 06:33:24.469401878 +0000 UTC m=+141.812985386"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.485067 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hj64t" podStartSLOduration=6.485048283 podStartE2EDuration="6.485048283s" podCreationTimestamp="2025-12-03 06:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.483617646 +0000 UTC m=+141.827201154" watchObservedRunningTime="2025-12-03 06:33:24.485048283 +0000 UTC m=+141.828631791"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.525825 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" event={"ID":"1358e7a0-8632-4701-9a97-151d31f553cc","Type":"ContainerStarted","Data":"375ffc1bf342bb2d055a41ae847a348d2789394610637261be42dc66c52d7c3e"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.564140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" event={"ID":"0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6","Type":"ContainerStarted","Data":"9c90aa979ee61b9f62537791a856a71f5aa304fc5cc2bf355ee9d6e0543b5d03"}
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.564635 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-shxrz"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.564739 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.569526 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.069508791 +0000 UTC m=+142.413092299 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.570468 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" podStartSLOduration=121.570451851 podStartE2EDuration="2m1.570451851s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.569922384 +0000 UTC m=+141.913505892" watchObservedRunningTime="2025-12-03 06:33:24.570451851 +0000 UTC m=+141.914035349"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.610377 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-shxrz" podStartSLOduration=121.61036402 podStartE2EDuration="2m1.61036402s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.608099247 +0000 UTC m=+141.951682755" watchObservedRunningTime="2025-12-03 06:33:24.61036402 +0000 UTC m=+141.953947518"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.667401 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v"
Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.669891 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.169872382 +0000 UTC m=+142.513455970 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.687674 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" podStartSLOduration=121.687654146 podStartE2EDuration="2m1.687654146s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.661648136 +0000 UTC m=+142.005231644" watchObservedRunningTime="2025-12-03 06:33:24.687654146 +0000 UTC m=+142.031237664"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.688341 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-6pn5x" podStartSLOduration=121.688333538 podStartE2EDuration="2m1.688333538s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.685408273 +0000 UTC m=+142.028991781" watchObservedRunningTime="2025-12-03 06:33:24.688333538 +0000 UTC m=+142.031917056"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.746816 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7pqsq" podStartSLOduration=121.746801326 podStartE2EDuration="2m1.746801326s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:24.746550628 +0000 UTC m=+142.090134146" watchObservedRunningTime="2025-12-03 06:33:24.746801326 +0000 UTC m=+142.090384834"
Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.770615 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.771100 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.27108044 +0000 UTC m=+142.614663948 (durationBeforeRetry 500ms).
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.872893 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.873633 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.37358219 +0000 UTC m=+142.717165698 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.978944 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:24 crc kubenswrapper[4831]: E1203 06:33:24.979596 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.479578433 +0000 UTC m=+142.823161941 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.980380 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.991208 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:24 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:24 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:24 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:24 crc kubenswrapper[4831]: I1203 06:33:24.991260 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.087746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.088253 4831 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.588241532 +0000 UTC m=+142.931825040 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.188936 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.189260 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.689245854 +0000 UTC m=+143.032829362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.290441 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.290737 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.790721991 +0000 UTC m=+143.134305499 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.392881 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.393174 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:25.89315929 +0000 UTC m=+143.236742798 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.500529 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.500811 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.000800275 +0000 UTC m=+143.344383783 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.563410 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-shxrz" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.601229 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.601662 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.101613871 +0000 UTC m=+143.445197369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.602052 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.602363 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.102355545 +0000 UTC m=+143.445939053 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.616110 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hj64t" event={"ID":"a8a0aade-9024-4ece-adbf-c962d36de3bd","Type":"ContainerStarted","Data":"bfa773041c56b0db504be82e26f6cb3c036347b83099c8898d017374f4bb34da"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.633589 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" event={"ID":"66d7f813-acc8-4e09-9575-9c7848a3b062","Type":"ContainerStarted","Data":"1b5b480e53d0b6c474922e7b522d4cb12da38ca2fcc89593058ac263f6a3f438"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.661108 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" event={"ID":"7af4c3b5-563b-48c2-89d4-2a0975fad647","Type":"ContainerStarted","Data":"7a28b5c188490bea96f876d687ea7e64228f700966e8642ee015d5916b6a34c0"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.661982 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" podStartSLOduration=122.661968241 podStartE2EDuration="2m2.661968241s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:25.660636627 +0000 UTC m=+143.004220135" watchObservedRunningTime="2025-12-03 
06:33:25.661968241 +0000 UTC m=+143.005551749" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.676197 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" event={"ID":"453efca8-e843-4972-b9ae-eae3df7b02a6","Type":"ContainerStarted","Data":"0108a5fc47569c7ef0bcd8e9900e1e559a5493e316499c056d360b8da08d76ed"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.683022 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" event={"ID":"348c2fe3-0341-4203-adf6-719a6efbcd75","Type":"ContainerStarted","Data":"bdab5293838aa86dc4f542cea30cfa6ee1b10a1231c6edd04dfb0c95b0a3b89c"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.683066 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" event={"ID":"348c2fe3-0341-4203-adf6-719a6efbcd75","Type":"ContainerStarted","Data":"5a85530cf2672d4f2661d382fb696c502eae42b106bd7e2990df36482108bd72"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.699719 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-l2jlm" podStartSLOduration=122.699704429 podStartE2EDuration="2m2.699704429s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:25.698769048 +0000 UTC m=+143.042352556" watchObservedRunningTime="2025-12-03 06:33:25.699704429 +0000 UTC m=+143.043287937" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.706658 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.707494 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.20747799 +0000 UTC m=+143.551061498 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.712460 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2msr" event={"ID":"8e25a224-94a7-41ce-ae8c-6f60660873c4","Type":"ContainerStarted","Data":"089129dbad8d855c27946650a4be31b4a06ffde9593db85cb3a3a2c8edce992c"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.743590 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" event={"ID":"ae2cef55-64b8-4739-a0d6-4eca7d968107","Type":"ContainerStarted","Data":"fe15c74aec3863c133a539b4b054006f6b8fb9087b44378618d6d39e510ccf74"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.744373 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.747443 4831 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-fv8cx container/olm-operator namespace/openshift-operator-lifecycle-manager: 
Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.747476 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" podUID="ae2cef55-64b8-4739-a0d6-4eca7d968107" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.778718 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" event={"ID":"15853325-cf95-43bc-a17f-baecad9d4282","Type":"ContainerStarted","Data":"b3d4f62a75405354b11678105416428e93805baf1b91ea82f77b5f2c4fd6affa"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.781956 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-d7hl4" podStartSLOduration=122.781942365 podStartE2EDuration="2m2.781942365s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:25.765710441 +0000 UTC m=+143.109293949" watchObservedRunningTime="2025-12-03 06:33:25.781942365 +0000 UTC m=+143.125525873" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.809676 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:25 crc kubenswrapper[4831]: 
E1203 06:33:25.810886 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.310875079 +0000 UTC m=+143.654458587 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.816885 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" event={"ID":"f2c111fe-9bfd-4701-9d8f-6077978c9d87","Type":"ContainerStarted","Data":"9852cec765db61530cbf04f9d49614e8b10522433b97b7c236656fb996179cb0"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.819373 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" event={"ID":"ce1e278b-fdf9-436e-87cb-01c6bb162a6e","Type":"ContainerStarted","Data":"d1619b529626adab1feff976320be479ca0346851d55084482f017ee6ad26db7"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.821519 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" event={"ID":"d6c22156-9bf2-4e81-9dba-8657b5761a4f","Type":"ContainerStarted","Data":"44d5bf147dd12d8851df19e4d74de46d5daa8c788a090b5f16fb4049bd97abc9"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.839178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" 
event={"ID":"7a1bc70a-7234-44c0-9b2d-5f18cc6e13fc","Type":"ContainerStarted","Data":"2e5758662a139225acba9ef68178aec651ed19169c5de5ad2d14772dfc505aad"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.878510 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" event={"ID":"199e29b9-0d3f-471b-bf0e-de1576f2654a","Type":"ContainerStarted","Data":"8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.879498 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.883448 4831 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-87j4p container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" start-of-body= Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.883483 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" podUID="199e29b9-0d3f-471b-bf0e-de1576f2654a" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.30:6443/healthz\": dial tcp 10.217.0.30:6443: connect: connection refused" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.896010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" event={"ID":"691274e4-346e-4eed-860a-12513d61bd02","Type":"ContainerStarted","Data":"d77f4b4bfce00b0dad8f7e5255a2d73aa76465171fa2fd09bf9ddad15ba61e9e"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.913980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.914340 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.414303289 +0000 UTC m=+143.757886787 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.914618 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:25 crc kubenswrapper[4831]: E1203 06:33:25.921969 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.421945966 +0000 UTC m=+143.765529474 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.922286 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" event={"ID":"ab667483-406a-451a-9695-d9453775a063","Type":"ContainerStarted","Data":"9d205af81edbd84fcfbde87f927f538a4103058cc69ac800cdf696f5a75120ed"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.941384 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" podStartSLOduration=122.941359373 podStartE2EDuration="2m2.941359373s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:25.930660118 +0000 UTC m=+143.274243626" watchObservedRunningTime="2025-12-03 06:33:25.941359373 +0000 UTC m=+143.284942881" Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.957124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" event={"ID":"1358e7a0-8632-4701-9a97-151d31f553cc","Type":"ContainerStarted","Data":"a38d3bb0e5079e9657eadbbed07f687a2bafccd6cc061132decc726e4562b0e3"} Dec 03 06:33:25 crc kubenswrapper[4831]: I1203 06:33:25.957178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" 
event={"ID":"1358e7a0-8632-4701-9a97-151d31f553cc","Type":"ContainerStarted","Data":"a70e87ba66e7f443dfe76acfc08f2c443de98bc125d8d6daac4822746067f3dc"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.001646 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" event={"ID":"19cb3565-2001-4348-bb7f-79e9b1ce8aa8","Type":"ContainerStarted","Data":"710727bcdd196e2dac71466072a177e8a76285828f4659a7c354f9cb1e881a75"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.002431 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:26 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:26 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:26 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.002459 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.015351 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-pzrgg" podStartSLOduration=123.015328811 podStartE2EDuration="2m3.015328811s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.0028888 +0000 UTC m=+143.346472308" watchObservedRunningTime="2025-12-03 06:33:26.015328811 +0000 UTC m=+143.358912319" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.016262 
4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.016452 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.516440608 +0000 UTC m=+143.860024116 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.016652 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.018128 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.518120661 +0000 UTC m=+143.861704169 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.061362 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" event={"ID":"64f831a1-076e-47f9-afba-8812335ee8b5","Type":"ContainerStarted","Data":"713007540ef45524638ad0e48873b1d9f9220afff3b6a713a9a8a12275d7c0e4"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.061407 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" event={"ID":"64f831a1-076e-47f9-afba-8812335ee8b5","Type":"ContainerStarted","Data":"d431e8b299f4d9c638ad1e210c8a290a838ae5c993cc62597fb18996044fcafe"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.062831 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-xdfq7" podStartSLOduration=123.062815675 podStartE2EDuration="2m3.062815675s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.04407748 +0000 UTC m=+143.387660988" watchObservedRunningTime="2025-12-03 06:33:26.062815675 +0000 UTC m=+143.406399183" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.075168 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" 
event={"ID":"738e77b8-85e3-468e-8e03-81b14335094e","Type":"ContainerStarted","Data":"28966b80ad2f648f8efe430614ca6e7d258a079fdcd9da4266d1f998337b3266"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.093373 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-px6wl" podStartSLOduration=123.093359701 podStartE2EDuration="2m3.093359701s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.090601872 +0000 UTC m=+143.434185380" watchObservedRunningTime="2025-12-03 06:33:26.093359701 +0000 UTC m=+143.436943209" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.107895 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" event={"ID":"2e8e18de-44f2-4319-980f-2c29f0e86336","Type":"ContainerStarted","Data":"4841a6644d87091cb149c345c143dd58677573170383da0a0bd50080357281a6"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.107937 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" event={"ID":"2e8e18de-44f2-4319-980f-2c29f0e86336","Type":"ContainerStarted","Data":"6f6f7f7a541d4774acab0db1294054e767374712ef224fc4255bdd63c2d96d8b"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.109943 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" event={"ID":"7222afd6-e87f-4f82-84a7-ce0316d03086","Type":"ContainerStarted","Data":"693dc08ad478866c0267465cbdcff59bb99926ac73fdb2459ab8d536595468de"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.119458 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" event={"ID":"135075ee-2f44-402b-a071-36b3b720d928","Type":"ContainerStarted","Data":"7973a7bb0fe3f24d5a7fef9c18eefc12726352545301f6c9c6d16b23bc76b8c8"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.120617 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.120959 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.122466 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.622450791 +0000 UTC m=+143.966034299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.124886 4831 generic.go:334] "Generic (PLEG): container finished" podID="f2985436-5396-4ae0-936a-890d28feee53" containerID="1ffd6cc6aa4f88828075711385aa72588a6b2014448e12f100014cdffd6c8ee7" exitCode=0 Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.124947 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" event={"ID":"f2985436-5396-4ae0-936a-890d28feee53","Type":"ContainerDied","Data":"1ffd6cc6aa4f88828075711385aa72588a6b2014448e12f100014cdffd6c8ee7"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.132581 4831 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-vjhwt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.132639 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" podUID="135075ee-2f44-402b-a071-36b3b720d928" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.149418 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" 
event={"ID":"06fb5461-cc65-431e-9236-70177a346997","Type":"ContainerStarted","Data":"e76e93603f5e4b361cf60d21abb6ec531f51bce7b071fd27b3ab45b24491a295"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.149559 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-z4sqp" podStartSLOduration=123.149542475 podStartE2EDuration="2m3.149542475s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.148172652 +0000 UTC m=+143.491756160" watchObservedRunningTime="2025-12-03 06:33:26.149542475 +0000 UTC m=+143.493125983" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.150247 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.156413 4831 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-jtcpn container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" start-of-body= Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.156459 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" podUID="06fb5461-cc65-431e-9236-70177a346997" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.27:5443/healthz\": dial tcp 10.217.0.27:5443: connect: connection refused" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.184892 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" 
event={"ID":"7528353d-35b1-42b9-84b7-14d53336d3ef","Type":"ContainerStarted","Data":"5642ba4de2471f1e2524da26e6177a858b8d2758a4abd058b4fc9ea5d0f93f79"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.225368 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.228058 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.72802807 +0000 UTC m=+144.071611568 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.228444 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.229647 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.234443 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" event={"ID":"9c8c9c39-1846-4079-ada6-7a668288ac02","Type":"ContainerStarted","Data":"36148d98b54f6536ce768c71d7a25edb57b058c927e398cd2b62a760e05661ea"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.234472 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" event={"ID":"9c8c9c39-1846-4079-ada6-7a668288ac02","Type":"ContainerStarted","Data":"01aca614949d7fff5b804c2c25bf035b14bd41d5940ac0973abd51daf1645c6d"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.234515 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.235190 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" podStartSLOduration=123.235175531 podStartE2EDuration="2m3.235175531s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.233738214 +0000 UTC m=+143.577321722" watchObservedRunningTime="2025-12-03 06:33:26.235175531 +0000 UTC m=+143.578759039" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.235416 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" podStartSLOduration=123.235412359 podStartE2EDuration="2m3.235412359s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.185687013 +0000 UTC m=+143.529270521" watchObservedRunningTime="2025-12-03 06:33:26.235412359 +0000 UTC m=+143.578995867" Dec 03 06:33:26 crc 
kubenswrapper[4831]: I1203 06:33:26.240707 4831 patch_prober.go:28] interesting pod/apiserver-7bbb656c7d-48l57 container/oauth-apiserver namespace/openshift-oauth-apiserver: Startup probe status=failure output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.240755 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" podUID="66d7f813-acc8-4e09-9575-9c7848a3b062" containerName="oauth-apiserver" probeResult="failure" output="Get \"https://10.217.0.8:8443/livez\": dial tcp 10.217.0.8:8443: connect: connection refused" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.269180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" event={"ID":"7245e283-43e6-4dc1-b93a-0e3452b903f8","Type":"ContainerStarted","Data":"cf5adc53117c169692f0b53a5fbc368da283cbd846148ae6b54c9eb877b8c32c"} Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.269225 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.269889 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2t7pp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.269934 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2t7pp" podUID="bfba7fc4-12b2-40ff-b18c-170051c75374" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 
06:33:26.282927 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-knpqw" podStartSLOduration=123.282901253 podStartE2EDuration="2m3.282901253s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.268031822 +0000 UTC m=+143.611615320" watchObservedRunningTime="2025-12-03 06:33:26.282901253 +0000 UTC m=+143.626484771" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.295885 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-fxwnr" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.327615 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.328747 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.828733292 +0000 UTC m=+144.172316800 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.359998 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" podStartSLOduration=123.359982041 podStartE2EDuration="2m3.359982041s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.359652911 +0000 UTC m=+143.703236419" watchObservedRunningTime="2025-12-03 06:33:26.359982041 +0000 UTC m=+143.703565549" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.360185 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-5x48c" podStartSLOduration=123.360181098 podStartE2EDuration="2m3.360181098s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.307258829 +0000 UTC m=+143.650842337" watchObservedRunningTime="2025-12-03 06:33:26.360181098 +0000 UTC m=+143.703764606" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.394924 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" podStartSLOduration=123.39490697 podStartE2EDuration="2m3.39490697s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.393783373 +0000 UTC m=+143.737366891" watchObservedRunningTime="2025-12-03 06:33:26.39490697 +0000 UTC m=+143.738490468" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.428974 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" podStartSLOduration=123.42895568 podStartE2EDuration="2m3.42895568s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.420787186 +0000 UTC m=+143.764370704" watchObservedRunningTime="2025-12-03 06:33:26.42895568 +0000 UTC m=+143.772539188" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.435815 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.440755 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:26.940742989 +0000 UTC m=+144.284326497 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.443997 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-64gsc"] Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.445305 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.451616 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.463448 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-64gsc"] Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.480185 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-b7ghk" podStartSLOduration=123.465302563 podStartE2EDuration="2m3.465302563s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.463931848 +0000 UTC m=+143.807515356" watchObservedRunningTime="2025-12-03 06:33:26.465302563 +0000 UTC m=+143.808886071" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.536755 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.536985 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-utilities\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.537049 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-catalog-content\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.537108 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4npb\" (UniqueName: \"kubernetes.io/projected/5cdecddf-df66-4aef-bc33-65cbcf74db58-kube-api-access-x4npb\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.537210 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.037196685 +0000 UTC m=+144.380780193 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.563959 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6w69b" podStartSLOduration=123.563936828 podStartE2EDuration="2m3.563936828s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.552624752 +0000 UTC m=+143.896208260" watchObservedRunningTime="2025-12-03 06:33:26.563936828 +0000 UTC m=+143.907520326" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.605766 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rtq2s"] Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.613824 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-kjgd7" podStartSLOduration=123.613806989 podStartE2EDuration="2m3.613806989s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.611732881 +0000 UTC m=+143.955316389" watchObservedRunningTime="2025-12-03 06:33:26.613806989 +0000 UTC m=+143.957390497" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.614787 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.633819 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.638399 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-utilities\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.638462 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-catalog-content\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.638506 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.638529 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4npb\" (UniqueName: \"kubernetes.io/projected/5cdecddf-df66-4aef-bc33-65cbcf74db58-kube-api-access-x4npb\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.639500 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-utilities\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.640603 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-catalog-content\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.640812 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.140801571 +0000 UTC m=+144.484385079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.641208 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rtq2s"] Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.649802 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-vxszx" podStartSLOduration=123.649783891 podStartE2EDuration="2m3.649783891s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.648672834 +0000 UTC m=+143.992256333" watchObservedRunningTime="2025-12-03 06:33:26.649783891 +0000 UTC m=+143.993367389" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.680148 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-jtlgt" podStartSLOduration=123.680133941 podStartE2EDuration="2m3.680133941s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.677207096 +0000 UTC m=+144.020790604" watchObservedRunningTime="2025-12-03 06:33:26.680133941 +0000 UTC m=+144.023717449" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.699312 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-x4npb\" (UniqueName: \"kubernetes.io/projected/5cdecddf-df66-4aef-bc33-65cbcf74db58-kube-api-access-x4npb\") pod \"certified-operators-64gsc\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.740600 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.740854 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9w5n\" (UniqueName: \"kubernetes.io/projected/ccfbc043-c76e-4afd-a7e7-db0057427fa5-kube-api-access-x9w5n\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.740979 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-catalog-content\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.741038 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-utilities\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.741157 4831 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.241137721 +0000 UTC m=+144.584721229 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.784625 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.831484 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-57gps" podStartSLOduration=123.831469278 podStartE2EDuration="2m3.831469278s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.725261908 +0000 UTC m=+144.068845416" watchObservedRunningTime="2025-12-03 06:33:26.831469278 +0000 UTC m=+144.175052786" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.843787 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-catalog-content\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " pod="openshift-marketplace/community-operators-rtq2s" Dec 03 
06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.843846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-utilities\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.843869 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.843891 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9w5n\" (UniqueName: \"kubernetes.io/projected/ccfbc043-c76e-4afd-a7e7-db0057427fa5-kube-api-access-x9w5n\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.844507 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-catalog-content\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.844725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-utilities\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " 
pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.844933 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.344921882 +0000 UTC m=+144.688505390 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.873210 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-s4k78"] Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.874141 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.901280 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9w5n\" (UniqueName: \"kubernetes.io/projected/ccfbc043-c76e-4afd-a7e7-db0057427fa5-kube-api-access-x9w5n\") pod \"community-operators-rtq2s\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.912825 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4k78"] Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.946938 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.947250 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-utilities\") pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.947479 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-catalog-content\") pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.947497 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l75q8\" (UniqueName: \"kubernetes.io/projected/845a21ea-b176-471d-bacd-b98289285d1c-kube-api-access-l75q8\") pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:26 crc kubenswrapper[4831]: E1203 06:33:26.947596 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.447581817 +0000 UTC m=+144.791165315 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.954862 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.955271 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" podStartSLOduration=123.955256126 podStartE2EDuration="2m3.955256126s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:26.939680752 +0000 UTC m=+144.283264260" watchObservedRunningTime="2025-12-03 06:33:26.955256126 +0000 UTC m=+144.298839634" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.989678 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:26 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:26 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:26 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.989950 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:26 crc kubenswrapper[4831]: I1203 06:33:26.991414 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" podStartSLOduration=123.991394843 podStartE2EDuration="2m3.991394843s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
06:33:26.990457102 +0000 UTC m=+144.334040610" watchObservedRunningTime="2025-12-03 06:33:26.991394843 +0000 UTC m=+144.334978351" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.019735 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-fdmr5"] Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.020784 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.049159 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-utilities\") pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.049207 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.049228 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-catalog-content\") pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.049249 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l75q8\" (UniqueName: \"kubernetes.io/projected/845a21ea-b176-471d-bacd-b98289285d1c-kube-api-access-l75q8\") 
pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.050329 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.550295685 +0000 UTC m=+144.893879183 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.050763 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-utilities\") pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.051200 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-catalog-content\") pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.052779 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdmr5"] Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.149876 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.150080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9b9xn\" (UniqueName: \"kubernetes.io/projected/de96d02d-bef7-48a1-85f6-2d6086ff6498-kube-api-access-9b9xn\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.150160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-utilities\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.150239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-catalog-content\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.150346 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.650332785 +0000 UTC m=+144.993916293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.150718 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l75q8\" (UniqueName: \"kubernetes.io/projected/845a21ea-b176-471d-bacd-b98289285d1c-kube-api-access-l75q8\") pod \"certified-operators-s4k78\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.191887 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.253219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-utilities\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.253488 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.253530 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-catalog-content\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.253570 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9b9xn\" (UniqueName: \"kubernetes.io/projected/de96d02d-bef7-48a1-85f6-2d6086ff6498-kube-api-access-9b9xn\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.253994 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-utilities\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.254141 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.754131017 +0000 UTC m=+145.097714525 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.254285 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-catalog-content\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.310677 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9b9xn\" (UniqueName: \"kubernetes.io/projected/de96d02d-bef7-48a1-85f6-2d6086ff6498-kube-api-access-9b9xn\") pod \"community-operators-fdmr5\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.329477 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-k8c4l" event={"ID":"d6c22156-9bf2-4e81-9dba-8657b5761a4f","Type":"ContainerStarted","Data":"cd193baee332e8a588ce3f74e73342f846213c58dc9acde16586b13f14a0e762"} Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.341699 4831 generic.go:334] "Generic (PLEG): container finished" podID="e528cdde-c61e-42dd-8c55-e5276df017c6" containerID="a7f810c2729b06550a79448fe19586d7a79f83aa112785034d92eece47e37f1f" exitCode=0 Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.341794 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" event={"ID":"e528cdde-c61e-42dd-8c55-e5276df017c6","Type":"ContainerDied","Data":"a7f810c2729b06550a79448fe19586d7a79f83aa112785034d92eece47e37f1f"} Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.364790 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.365561 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.365915 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.865897866 +0000 UTC m=+145.209481374 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.371926 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" event={"ID":"453efca8-e843-4972-b9ae-eae3df7b02a6","Type":"ContainerStarted","Data":"8763d90861f8829226ab98fe2ff801de820749ed16f7eecc3bb39977540825a9"} Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.452023 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-sb6sm" event={"ID":"19cb3565-2001-4348-bb7f-79e9b1ce8aa8","Type":"ContainerStarted","Data":"2fd825a312de790863d151633717a1acc2d1439363e5bfc259c67ae95b659d77"} Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.455126 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" event={"ID":"f2985436-5396-4ae0-936a-890d28feee53","Type":"ContainerStarted","Data":"a89cc77c5cfb483ec35b787cd81ed6e900d369d3cbed9c47d6359bed325b459c"} Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.463115 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-d2msr" event={"ID":"8e25a224-94a7-41ce-ae8c-6f60660873c4","Type":"ContainerStarted","Data":"55dc6dad3e8fd5de9ab0aed5bd620e1557fde34aa7be1de9ab68af8dc0b2634c"} Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.493070 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.494435 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:27.994419547 +0000 UTC m=+145.338003055 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.501108 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2t7pp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.501163 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2t7pp" podUID="bfba7fc4-12b2-40ff-b18c-170051c75374" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.509249 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-64gsc"] Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 
06:33:27.520921 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-fv8cx" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.523005 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.533299 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-d2msr" podStartSLOduration=9.533251041 podStartE2EDuration="9.533251041s" podCreationTimestamp="2025-12-03 06:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:27.518783773 +0000 UTC m=+144.862367291" watchObservedRunningTime="2025-12-03 06:33:27.533251041 +0000 UTC m=+144.876834549" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.533832 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.587919 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.595521 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.597287 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:33:28.097273068 +0000 UTC m=+145.440856576 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.607895 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.608152 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.698974 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.699365 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2025-12-03 06:33:28.199350615 +0000 UTC m=+145.542934123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.706684 4831 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.799520 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.799950 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:28.299933303 +0000 UTC m=+145.643516811 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.901096 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:27 crc kubenswrapper[4831]: E1203 06:33:27.901630 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:28.401618477 +0000 UTC m=+145.745201985 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.986466 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:27 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:27 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:27 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:27 crc kubenswrapper[4831]: I1203 06:33:27.986672 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.002477 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:28 crc kubenswrapper[4831]: E1203 06:33:28.002773 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2025-12-03 06:33:28.502759093 +0000 UTC m=+145.846342601 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.101769 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-fdmr5"] Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.104145 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:28 crc kubenswrapper[4831]: E1203 06:33:28.104434 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:28.604422706 +0000 UTC m=+145.948006204 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.147064 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-jtcpn" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.175576 4831 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2025-12-03T06:33:27.706703452Z","Handler":null,"Name":""} Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.181235 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rtq2s"] Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.206544 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:28 crc kubenswrapper[4831]: E1203 06:33:28.206686 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2025-12-03 06:33:28.706660528 +0000 UTC m=+146.050244036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.206798 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:28 crc kubenswrapper[4831]: E1203 06:33:28.207115 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2025-12-03 06:33:28.707102612 +0000 UTC m=+146.050686120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-ncw4v" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.209716 4831 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.209743 4831 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.231244 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-s4k78"] Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.307523 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.403091 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.409377 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.424376 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.424412 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.459882 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-ncw4v\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.469413 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" 
event={"ID":"453efca8-e843-4972-b9ae-eae3df7b02a6","Type":"ContainerStarted","Data":"0d630c9f590987b7822c58c5e8978d1d36d2579b6f645c2004417f1a52152996"} Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.470161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdmr5" event={"ID":"de96d02d-bef7-48a1-85f6-2d6086ff6498","Type":"ContainerStarted","Data":"ea6004a14ec9765a93c18a26d241f288366156cb5a7a54206e46e2a8620f6f86"} Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.471859 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" event={"ID":"f2985436-5396-4ae0-936a-890d28feee53","Type":"ContainerStarted","Data":"4d7c5623a8f462f2779d77490bf1bed9fdeded3d7a8fd070ac648cee7d821312"} Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.472890 4831 generic.go:334] "Generic (PLEG): container finished" podID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerID="f9874f5c209994a6cde1827fe496db55cdadf8f1545eb9eb0ab97a79619d9e34" exitCode=0 Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.472936 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gsc" event={"ID":"5cdecddf-df66-4aef-bc33-65cbcf74db58","Type":"ContainerDied","Data":"f9874f5c209994a6cde1827fe496db55cdadf8f1545eb9eb0ab97a79619d9e34"} Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.472952 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gsc" event={"ID":"5cdecddf-df66-4aef-bc33-65cbcf74db58","Type":"ContainerStarted","Data":"aafa2a436c94f5a7e53dd1e7b9bc2f352897d17d3254b1c9b6fadf2528ac0a05"} Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.474190 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.475500 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-s4k78" event={"ID":"845a21ea-b176-471d-bacd-b98289285d1c","Type":"ContainerStarted","Data":"75583351d7e90dfab5bf0a7d2fe52634e0484919e890762ae4abb5e540c3f976"} Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.477229 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtq2s" event={"ID":"ccfbc043-c76e-4afd-a7e7-db0057427fa5","Type":"ContainerStarted","Data":"80fea317094a3d00023c8a4666edafa49fe939e30c3762fa57878d03865d1d13"} Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.497779 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" podStartSLOduration=125.497754439 podStartE2EDuration="2m5.497754439s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:28.492192119 +0000 UTC m=+145.835775637" watchObservedRunningTime="2025-12-03 06:33:28.497754439 +0000 UTC m=+145.841337937" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.590589 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.593302 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-4v6c6"] Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.594229 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.595846 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.602225 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4v6c6"] Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.721393 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kvvz\" (UniqueName: \"kubernetes.io/projected/bf274771-c291-4ab4-9f69-1e1554707a6c-kube-api-access-5kvvz\") pod \"redhat-marketplace-4v6c6\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.721702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-catalog-content\") pod \"redhat-marketplace-4v6c6\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.721800 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-utilities\") pod \"redhat-marketplace-4v6c6\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.822798 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-utilities\") pod \"redhat-marketplace-4v6c6\" (UID: 
\"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.822838 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kvvz\" (UniqueName: \"kubernetes.io/projected/bf274771-c291-4ab4-9f69-1e1554707a6c-kube-api-access-5kvvz\") pod \"redhat-marketplace-4v6c6\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.822867 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-catalog-content\") pod \"redhat-marketplace-4v6c6\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.823443 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-utilities\") pod \"redhat-marketplace-4v6c6\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.828755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-catalog-content\") pod \"redhat-marketplace-4v6c6\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.844763 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kvvz\" (UniqueName: \"kubernetes.io/projected/bf274771-c291-4ab4-9f69-1e1554707a6c-kube-api-access-5kvvz\") pod \"redhat-marketplace-4v6c6\" (UID: 
\"bf274771-c291-4ab4-9f69-1e1554707a6c\") " pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.881044 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.891189 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncw4v"] Dec 03 06:33:28 crc kubenswrapper[4831]: W1203 06:33:28.907963 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d9d5be4_e63c_4f4a_85c0_ef3735894eb4.slice/crio-0dd3fa2e4c40f947b90235108f947a2e5fd174dc1790c383be63326f6073d8eb WatchSource:0}: Error finding container 0dd3fa2e4c40f947b90235108f947a2e5fd174dc1790c383be63326f6073d8eb: Status 404 returned error can't find the container with id 0dd3fa2e4c40f947b90235108f947a2e5fd174dc1790c383be63326f6073d8eb Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.908264 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.925610 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj249\" (UniqueName: \"kubernetes.io/projected/e528cdde-c61e-42dd-8c55-e5276df017c6-kube-api-access-rj249\") pod \"e528cdde-c61e-42dd-8c55-e5276df017c6\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.925958 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume\") pod \"e528cdde-c61e-42dd-8c55-e5276df017c6\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.926024 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume\") pod \"e528cdde-c61e-42dd-8c55-e5276df017c6\" (UID: \"e528cdde-c61e-42dd-8c55-e5276df017c6\") " Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.926687 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume" (OuterVolumeSpecName: "config-volume") pod "e528cdde-c61e-42dd-8c55-e5276df017c6" (UID: "e528cdde-c61e-42dd-8c55-e5276df017c6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.943901 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e528cdde-c61e-42dd-8c55-e5276df017c6-kube-api-access-rj249" (OuterVolumeSpecName: "kube-api-access-rj249") pod "e528cdde-c61e-42dd-8c55-e5276df017c6" (UID: "e528cdde-c61e-42dd-8c55-e5276df017c6"). InnerVolumeSpecName "kube-api-access-rj249". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.945610 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e528cdde-c61e-42dd-8c55-e5276df017c6" (UID: "e528cdde-c61e-42dd-8c55-e5276df017c6"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.982500 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:28 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:28 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:28 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:28 crc kubenswrapper[4831]: I1203 06:33:28.982553 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.004592 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lbzbt"] Dec 03 06:33:29 crc kubenswrapper[4831]: E1203 06:33:29.004801 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e528cdde-c61e-42dd-8c55-e5276df017c6" containerName="collect-profiles" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.004813 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e528cdde-c61e-42dd-8c55-e5276df017c6" containerName="collect-profiles" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.004916 4831 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e528cdde-c61e-42dd-8c55-e5276df017c6" containerName="collect-profiles" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.009919 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.028282 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-utilities\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.028374 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-catalog-content\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.028422 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9z82m\" (UniqueName: \"kubernetes.io/projected/8ca97992-94be-4cf7-b532-1239da97bf6d-kube-api-access-9z82m\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.028467 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e528cdde-c61e-42dd-8c55-e5276df017c6-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.028477 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/e528cdde-c61e-42dd-8c55-e5276df017c6-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.028486 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj249\" (UniqueName: \"kubernetes.io/projected/e528cdde-c61e-42dd-8c55-e5276df017c6-kube-api-access-rj249\") on node \"crc\" DevicePath \"\"" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.032540 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.033255 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbzbt"] Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.131405 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9z82m\" (UniqueName: \"kubernetes.io/projected/8ca97992-94be-4cf7-b532-1239da97bf6d-kube-api-access-9z82m\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.131487 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-utilities\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.131533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-catalog-content\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" 
Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.132256 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-catalog-content\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.132569 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-utilities\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.160200 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9z82m\" (UniqueName: \"kubernetes.io/projected/8ca97992-94be-4cf7-b532-1239da97bf6d-kube-api-access-9z82m\") pod \"redhat-marketplace-lbzbt\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.341280 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.375567 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-4v6c6"] Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.451698 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.452308 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.459333 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.459543 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.477259 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.491821 4831 generic.go:334] "Generic (PLEG): container finished" podID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerID="8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d" exitCode=0 Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.491884 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdmr5" event={"ID":"de96d02d-bef7-48a1-85f6-2d6086ff6498","Type":"ContainerDied","Data":"8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d"} Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.519072 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4v6c6" event={"ID":"bf274771-c291-4ab4-9f69-1e1554707a6c","Type":"ContainerStarted","Data":"31ae8e730854386722f81226b580e143344d454983a84bbccd0b0e840a2274c5"} Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.529069 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" event={"ID":"e528cdde-c61e-42dd-8c55-e5276df017c6","Type":"ContainerDied","Data":"0b2a6b11a65bd210057cb4a722cdfa0f3f4bc963c3be48c414a9fe4d2058aaf8"} Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.529112 4831 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="0b2a6b11a65bd210057cb4a722cdfa0f3f4bc963c3be48c414a9fe4d2058aaf8" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.529224 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.540783 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.543536 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.559133 4831 generic.go:334] "Generic (PLEG): container finished" podID="845a21ea-b176-471d-bacd-b98289285d1c" containerID="804f9e8e87b6daeb6e06030bbcd451e81a53d6ac709746beb5b09c85f8d94454" exitCode=0 Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.559228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4k78" event={"ID":"845a21ea-b176-471d-bacd-b98289285d1c","Type":"ContainerDied","Data":"804f9e8e87b6daeb6e06030bbcd451e81a53d6ac709746beb5b09c85f8d94454"} Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.569554 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" 
event={"ID":"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4","Type":"ContainerStarted","Data":"aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080"} Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.569600 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" event={"ID":"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4","Type":"ContainerStarted","Data":"0dd3fa2e4c40f947b90235108f947a2e5fd174dc1790c383be63326f6073d8eb"} Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.569729 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.572983 4831 generic.go:334] "Generic (PLEG): container finished" podID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerID="e190fc9cdacc185ec3c5349d8f84cb6c70e0f6278a5e7ab17e90e5e38642f498" exitCode=0 Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.573089 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtq2s" event={"ID":"ccfbc043-c76e-4afd-a7e7-db0057427fa5","Type":"ContainerDied","Data":"e190fc9cdacc185ec3c5349d8f84cb6c70e0f6278a5e7ab17e90e5e38642f498"} Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.608298 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ndqmw"] Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.611022 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" event={"ID":"453efca8-e843-4972-b9ae-eae3df7b02a6","Type":"ContainerStarted","Data":"243c31c438fb7c8872f11f1a9069229a0bf3d94e0c98aa77ce0213f5e9f7be2a"} Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.611123 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.612781 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ndqmw"] Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.614331 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.620831 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" podStartSLOduration=126.620811166 podStartE2EDuration="2m6.620811166s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:29.619618598 +0000 UTC m=+146.963202106" watchObservedRunningTime="2025-12-03 06:33:29.620811166 +0000 UTC m=+146.964394684" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.644822 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.645026 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.645596 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-utilities\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.645652 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-catalog-content\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.645668 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrstd\" (UniqueName: \"kubernetes.io/projected/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-kube-api-access-lrstd\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.646747 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbzbt"] Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.647446 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.671580 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-dv9z6" podStartSLOduration=10.671560535 podStartE2EDuration="10.671560535s" podCreationTimestamp="2025-12-03 06:33:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:29.667928258 +0000 UTC m=+147.011511766" watchObservedRunningTime="2025-12-03 06:33:29.671560535 +0000 UTC m=+147.015144043" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.674366 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:29 crc kubenswrapper[4831]: W1203 06:33:29.685723 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ca97992_94be_4cf7_b532_1239da97bf6d.slice/crio-e62cdeeccc9441442d1f68d29ba0d6803b936823a1e3de4bd11477d6441368d0 WatchSource:0}: Error finding container e62cdeeccc9441442d1f68d29ba0d6803b936823a1e3de4bd11477d6441368d0: Status 404 returned error can't find the container with id e62cdeeccc9441442d1f68d29ba0d6803b936823a1e3de4bd11477d6441368d0 Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.748139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-utilities\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.748188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-catalog-content\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.748229 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrstd\" (UniqueName: \"kubernetes.io/projected/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-kube-api-access-lrstd\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.749547 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-utilities\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.749724 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-catalog-content\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.764412 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrstd\" (UniqueName: \"kubernetes.io/projected/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-kube-api-access-lrstd\") pod \"redhat-operators-ndqmw\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.779626 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.980555 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:29 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:29 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:29 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.980949 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:29 crc kubenswrapper[4831]: I1203 06:33:29.987566 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.002464 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6phx5"] Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.003371 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.054551 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-catalog-content\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.054620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.054665 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.054754 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-utilities\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.054929 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod 
\"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.055094 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxh88\" (UniqueName: \"kubernetes.io/projected/3f4f1dcf-c437-4687-93b5-4198d80bff3d-kube-api-access-bxh88\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.055163 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.055937 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.064728 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6phx5"] Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.064770 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: 
\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.068042 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.076144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.156472 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bxh88\" (UniqueName: \"kubernetes.io/projected/3f4f1dcf-c437-4687-93b5-4198d80bff3d-kube-api-access-bxh88\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.156555 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-catalog-content\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.156639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-utilities\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.157214 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-catalog-content\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.157358 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-utilities\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.172767 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxh88\" (UniqueName: \"kubernetes.io/projected/3f4f1dcf-c437-4687-93b5-4198d80bff3d-kube-api-access-bxh88\") pod \"redhat-operators-6phx5\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.230907 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.240736 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.251113 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.268054 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Dec 03 06:33:30 crc kubenswrapper[4831]: W1203 06:33:30.285899 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5a36103d_2c3c_4625_85ff_678a5c4ee9cf.slice/crio-20b66957e2ab5001591cbaedcd43064387a1e5e5390a114727b5fe844037e165 WatchSource:0}: Error finding container 20b66957e2ab5001591cbaedcd43064387a1e5e5390a114727b5fe844037e165: Status 404 returned error can't find the container with id 20b66957e2ab5001591cbaedcd43064387a1e5e5390a114727b5fe844037e165 Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.359715 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.432595 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ndqmw"] Dec 03 06:33:30 crc kubenswrapper[4831]: W1203 06:33:30.487448 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea4d526d_cb74_4ac6_a3fe_33aad14c3444.slice/crio-04123ef09724e760335fa9636da353b69aaecc1a8b321488746f3ab6a1c5eeb3 WatchSource:0}: Error finding container 04123ef09724e760335fa9636da353b69aaecc1a8b321488746f3ab6a1c5eeb3: Status 404 returned error can't find the container with id 04123ef09724e760335fa9636da353b69aaecc1a8b321488746f3ab6a1c5eeb3 Dec 03 06:33:30 crc kubenswrapper[4831]: W1203 06:33:30.575171 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-3ffa80c394cdbf6c4b503f29e2d5060dc2975750495696f68da976e6313d0e1d 
WatchSource:0}: Error finding container 3ffa80c394cdbf6c4b503f29e2d5060dc2975750495696f68da976e6313d0e1d: Status 404 returned error can't find the container with id 3ffa80c394cdbf6c4b503f29e2d5060dc2975750495696f68da976e6313d0e1d Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.626389 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-rddpj" Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.631037 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5a36103d-2c3c-4625-85ff-678a5c4ee9cf","Type":"ContainerStarted","Data":"20b66957e2ab5001591cbaedcd43064387a1e5e5390a114727b5fe844037e165"} Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.635897 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndqmw" event={"ID":"ea4d526d-cb74-4ac6-a3fe-33aad14c3444","Type":"ContainerStarted","Data":"04123ef09724e760335fa9636da353b69aaecc1a8b321488746f3ab6a1c5eeb3"} Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.648084 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerID="ddf5a50878aa159057b41d98cf6fdfef6298e80f750ff8dab2f4bd61cddcacf6" exitCode=0 Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.648190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4v6c6" event={"ID":"bf274771-c291-4ab4-9f69-1e1554707a6c","Type":"ContainerDied","Data":"ddf5a50878aa159057b41d98cf6fdfef6298e80f750ff8dab2f4bd61cddcacf6"} Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.655746 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" 
event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"3ffa80c394cdbf6c4b503f29e2d5060dc2975750495696f68da976e6313d0e1d"} Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.662852 4831 generic.go:334] "Generic (PLEG): container finished" podID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerID="832b8b90569f40454e7f5ffc7b4554d531da87cf26dde69e2836aca39385447b" exitCode=0 Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.665018 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbzbt" event={"ID":"8ca97992-94be-4cf7-b532-1239da97bf6d","Type":"ContainerDied","Data":"832b8b90569f40454e7f5ffc7b4554d531da87cf26dde69e2836aca39385447b"} Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.665059 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbzbt" event={"ID":"8ca97992-94be-4cf7-b532-1239da97bf6d","Type":"ContainerStarted","Data":"e62cdeeccc9441442d1f68d29ba0d6803b936823a1e3de4bd11477d6441368d0"} Dec 03 06:33:30 crc kubenswrapper[4831]: W1203 06:33:30.724606 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-913be32d0bf024c47dde7ed76238c367f672e8c57f41bb0a61fe7c1e554cb949 WatchSource:0}: Error finding container 913be32d0bf024c47dde7ed76238c367f672e8c57f41bb0a61fe7c1e554cb949: Status 404 returned error can't find the container with id 913be32d0bf024c47dde7ed76238c367f672e8c57f41bb0a61fe7c1e554cb949 Dec 03 06:33:30 crc kubenswrapper[4831]: W1203 06:33:30.751746 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-dddc8a16c3e86cf6aefcd7f992d7dbe4f4704badecc4aa8aacc57b55bb237841 WatchSource:0}: Error finding container 
dddc8a16c3e86cf6aefcd7f992d7dbe4f4704badecc4aa8aacc57b55bb237841: Status 404 returned error can't find the container with id dddc8a16c3e86cf6aefcd7f992d7dbe4f4704badecc4aa8aacc57b55bb237841 Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.980264 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:30 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:30 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:30 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:30 crc kubenswrapper[4831]: I1203 06:33:30.980326 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.040206 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6phx5"] Dec 03 06:33:31 crc kubenswrapper[4831]: W1203 06:33:31.051258 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f4f1dcf_c437_4687_93b5_4198d80bff3d.slice/crio-82e628eac0c30eed291f5f076cc698cc79198c491df5756f3b8884a109a88b48 WatchSource:0}: Error finding container 82e628eac0c30eed291f5f076cc698cc79198c491df5756f3b8884a109a88b48: Status 404 returned error can't find the container with id 82e628eac0c30eed291f5f076cc698cc79198c491df5756f3b8884a109a88b48 Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.176932 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 
06:33:31.184478 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.235943 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.244373 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-48l57" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.247353 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.247396 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.249154 4831 patch_prober.go:28] interesting pod/console-f9d7485db-dwsb6 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body= Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.249210 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-dwsb6" podUID="c289d28d-642e-4cc4-9d25-f025800585d1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.285703 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2t7pp container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 
06:33:31.285748 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-2t7pp" podUID="bfba7fc4-12b2-40ff-b18c-170051c75374" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.285947 4831 patch_prober.go:28] interesting pod/downloads-7954f5f757-2t7pp container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" start-of-body= Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.285962 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-2t7pp" podUID="bfba7fc4-12b2-40ff-b18c-170051c75374" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.10:8080/\": dial tcp 10.217.0.10:8080: connect: connection refused" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.382597 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.382633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.398760 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.672701 4831 generic.go:334] "Generic (PLEG): container finished" podID="5a36103d-2c3c-4625-85ff-678a5c4ee9cf" containerID="538cc493386a53883e640b443a9d7bce55764b241b9fa14bce7d571438f46f60" exitCode=0 Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.673057 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5a36103d-2c3c-4625-85ff-678a5c4ee9cf","Type":"ContainerDied","Data":"538cc493386a53883e640b443a9d7bce55764b241b9fa14bce7d571438f46f60"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.682490 4831 generic.go:334] "Generic (PLEG): container finished" podID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerID="0ad2bd96ad34a1e3ee11191d845b498994363bf825f46d81b32b477282ce9a6e" exitCode=0 Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.682567 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndqmw" event={"ID":"ea4d526d-cb74-4ac6-a3fe-33aad14c3444","Type":"ContainerDied","Data":"0ad2bd96ad34a1e3ee11191d845b498994363bf825f46d81b32b477282ce9a6e"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.688067 4831 generic.go:334] "Generic (PLEG): container finished" podID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerID="4cf808f161d897df1e1526d2227b0b99ae0ac96b594c3ea0ce5019a1fc4fa55d" exitCode=0 Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.688156 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6phx5" event={"ID":"3f4f1dcf-c437-4687-93b5-4198d80bff3d","Type":"ContainerDied","Data":"4cf808f161d897df1e1526d2227b0b99ae0ac96b594c3ea0ce5019a1fc4fa55d"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.688189 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6phx5" event={"ID":"3f4f1dcf-c437-4687-93b5-4198d80bff3d","Type":"ContainerStarted","Data":"82e628eac0c30eed291f5f076cc698cc79198c491df5756f3b8884a109a88b48"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.696268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"cb2f82acdc37fabb9b82e01c507bbafc8b8715093fbcd28b1797a36cd7e8c1d1"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.696328 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"913be32d0bf024c47dde7ed76238c367f672e8c57f41bb0a61fe7c1e554cb949"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.696487 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.700590 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"ef0b51f548f68111f1f6dff82132773b8bdf2e97308c1f2510d47b5165835a57"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.700622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"dddc8a16c3e86cf6aefcd7f992d7dbe4f4704badecc4aa8aacc57b55bb237841"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.708289 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ed2f859ba3169a69ef51eefa4d77cee8f5b1cdcb5fee1967e54d21e2c200ecb4"} Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.713664 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-wcqwh" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.977936 4831 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.983831 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:31 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:31 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:31 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:31 crc kubenswrapper[4831]: I1203 06:33:31.983912 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.538094 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.538977 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.539062 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.542079 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.542281 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.615014 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.615065 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.715806 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.715859 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\") " 
pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.716023 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.748729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.862151 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.980235 4831 patch_prober.go:28] interesting pod/router-default-5444994796-6n7vj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Dec 03 06:33:32 crc kubenswrapper[4831]: [-]has-synced failed: reason withheld Dec 03 06:33:32 crc kubenswrapper[4831]: [+]process-running ok Dec 03 06:33:32 crc kubenswrapper[4831]: healthz check failed Dec 03 06:33:32 crc kubenswrapper[4831]: I1203 06:33:32.980290 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-6n7vj" podUID="294bf078-98d9-4c39-8fd5-f39926fbfe58" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.150210 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.198172 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.221588 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kubelet-dir\") pod \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\" (UID: \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\") " Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.221990 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kube-api-access\") pod \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\" (UID: \"5a36103d-2c3c-4625-85ff-678a5c4ee9cf\") " Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.223156 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5a36103d-2c3c-4625-85ff-678a5c4ee9cf" (UID: "5a36103d-2c3c-4625-85ff-678a5c4ee9cf"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.236232 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5a36103d-2c3c-4625-85ff-678a5c4ee9cf" (UID: "5a36103d-2c3c-4625-85ff-678a5c4ee9cf"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.325603 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.325643 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a36103d-2c3c-4625-85ff-678a5c4ee9cf-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.789805 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.789905 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5a36103d-2c3c-4625-85ff-678a5c4ee9cf","Type":"ContainerDied","Data":"20b66957e2ab5001591cbaedcd43064387a1e5e5390a114727b5fe844037e165"} Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.791308 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20b66957e2ab5001591cbaedcd43064387a1e5e5390a114727b5fe844037e165" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.798053 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ccb1bed3-01b5-43c0-bb47-c093c5b2050f","Type":"ContainerStarted","Data":"d62a11f7890a87f09950efa264c55afe99691d7b7ecb955d1ce740aa291b73ae"} Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.982996 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:33 crc kubenswrapper[4831]: I1203 06:33:33.985383 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ingress/router-default-5444994796-6n7vj" Dec 03 06:33:34 crc kubenswrapper[4831]: I1203 06:33:34.858776 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ccb1bed3-01b5-43c0-bb47-c093c5b2050f","Type":"ContainerStarted","Data":"5eb3de0afcd22e77b358cfb2a339f2b035e52249b03f83aa94d45634d32a25b5"} Dec 03 06:33:34 crc kubenswrapper[4831]: I1203 06:33:34.875005 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.874988133 podStartE2EDuration="2.874988133s" podCreationTimestamp="2025-12-03 06:33:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:33:34.874106435 +0000 UTC m=+152.217689943" watchObservedRunningTime="2025-12-03 06:33:34.874988133 +0000 UTC m=+152.218571641" Dec 03 06:33:35 crc kubenswrapper[4831]: I1203 06:33:35.871729 4831 generic.go:334] "Generic (PLEG): container finished" podID="ccb1bed3-01b5-43c0-bb47-c093c5b2050f" containerID="5eb3de0afcd22e77b358cfb2a339f2b035e52249b03f83aa94d45634d32a25b5" exitCode=0 Dec 03 06:33:35 crc kubenswrapper[4831]: I1203 06:33:35.872084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ccb1bed3-01b5-43c0-bb47-c093c5b2050f","Type":"ContainerDied","Data":"5eb3de0afcd22e77b358cfb2a339f2b035e52249b03f83aa94d45634d32a25b5"} Dec 03 06:33:37 crc kubenswrapper[4831]: I1203 06:33:37.169868 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-d2msr" Dec 03 06:33:41 crc kubenswrapper[4831]: I1203 06:33:41.268799 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:41 crc kubenswrapper[4831]: I1203 06:33:41.273514 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:33:41 crc kubenswrapper[4831]: I1203 06:33:41.276753 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-2t7pp" Dec 03 06:33:41 crc kubenswrapper[4831]: I1203 06:33:41.907253 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"ccb1bed3-01b5-43c0-bb47-c093c5b2050f","Type":"ContainerDied","Data":"d62a11f7890a87f09950efa264c55afe99691d7b7ecb955d1ce740aa291b73ae"} Dec 03 06:33:41 crc kubenswrapper[4831]: I1203 06:33:41.907520 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d62a11f7890a87f09950efa264c55afe99691d7b7ecb955d1ce740aa291b73ae" Dec 03 06:33:41 crc kubenswrapper[4831]: I1203 06:33:41.942618 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:42 crc kubenswrapper[4831]: I1203 06:33:42.074719 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kube-api-access\") pod \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\" (UID: \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\") " Dec 03 06:33:42 crc kubenswrapper[4831]: I1203 06:33:42.074759 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kubelet-dir\") pod \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\" (UID: \"ccb1bed3-01b5-43c0-bb47-c093c5b2050f\") " Dec 03 06:33:42 crc kubenswrapper[4831]: I1203 06:33:42.075061 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod 
"ccb1bed3-01b5-43c0-bb47-c093c5b2050f" (UID: "ccb1bed3-01b5-43c0-bb47-c093c5b2050f"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:33:42 crc kubenswrapper[4831]: I1203 06:33:42.081573 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ccb1bed3-01b5-43c0-bb47-c093c5b2050f" (UID: "ccb1bed3-01b5-43c0-bb47-c093c5b2050f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:33:42 crc kubenswrapper[4831]: I1203 06:33:42.176138 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:33:42 crc kubenswrapper[4831]: I1203 06:33:42.176206 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ccb1bed3-01b5-43c0-bb47-c093c5b2050f-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:33:42 crc kubenswrapper[4831]: I1203 06:33:42.910931 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Dec 03 06:33:46 crc kubenswrapper[4831]: I1203 06:33:46.545756 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:46 crc kubenswrapper[4831]: I1203 06:33:46.555974 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8283839a-a189-493f-bde7-e0193d575963-metrics-certs\") pod \"network-metrics-daemon-lllsw\" (UID: \"8283839a-a189-493f-bde7-e0193d575963\") " pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:46 crc kubenswrapper[4831]: I1203 06:33:46.833420 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-lllsw" Dec 03 06:33:48 crc kubenswrapper[4831]: I1203 06:33:48.605416 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:33:57 crc kubenswrapper[4831]: I1203 06:33:57.596742 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:33:57 crc kubenswrapper[4831]: I1203 06:33:57.597336 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 
06:33:58 crc kubenswrapper[4831]: E1203 06:33:58.970991 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Dec 03 06:33:58 crc kubenswrapper[4831]: E1203 06:33:58.971584 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9b9xn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-fdmr5_openshift-marketplace(de96d02d-bef7-48a1-85f6-2d6086ff6498): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Dec 03 06:33:58 crc kubenswrapper[4831]: E1203 06:33:58.973020 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-fdmr5" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" Dec 03 06:33:59 crc kubenswrapper[4831]: E1203 06:33:59.012275 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-fdmr5" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" Dec 03 06:33:59 crc kubenswrapper[4831]: I1203 06:33:59.314161 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-lllsw"] Dec 03 06:33:59 crc kubenswrapper[4831]: W1203 06:33:59.323381 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8283839a_a189_493f_bde7_e0193d575963.slice/crio-fc48c8390c0ba5d87e6145fef0fdef834353249473f2027b646ff3f4e30ddddc WatchSource:0}: Error finding container fc48c8390c0ba5d87e6145fef0fdef834353249473f2027b646ff3f4e30ddddc: Status 404 returned error can't find the container with id fc48c8390c0ba5d87e6145fef0fdef834353249473f2027b646ff3f4e30ddddc Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.004640 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtq2s" 
event={"ID":"ccfbc043-c76e-4afd-a7e7-db0057427fa5","Type":"ContainerDied","Data":"399f0b88f7684a260f5bd513cdca14805cc25c5ff3e40c0639d005383b2b3613"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.004464 4831 generic.go:334] "Generic (PLEG): container finished" podID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerID="399f0b88f7684a260f5bd513cdca14805cc25c5ff3e40c0639d005383b2b3613" exitCode=0 Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.011079 4831 generic.go:334] "Generic (PLEG): container finished" podID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerID="62bda3a1a2584028128622934e95acaaf70b654738a7715764ca2a94d4df15db" exitCode=0 Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.011180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndqmw" event={"ID":"ea4d526d-cb74-4ac6-a3fe-33aad14c3444","Type":"ContainerDied","Data":"62bda3a1a2584028128622934e95acaaf70b654738a7715764ca2a94d4df15db"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.013980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6phx5" event={"ID":"3f4f1dcf-c437-4687-93b5-4198d80bff3d","Type":"ContainerStarted","Data":"57d93b642634757b5ec710f07536cdbafa51fc4a24cd032e525ae5495a38e373"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.019725 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerID="02d715fcd5f09ec798cc093491d969cb08fae535535a67e096e4c7924b07ae17" exitCode=0 Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.019880 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4v6c6" event={"ID":"bf274771-c291-4ab4-9f69-1e1554707a6c","Type":"ContainerDied","Data":"02d715fcd5f09ec798cc093491d969cb08fae535535a67e096e4c7924b07ae17"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.022351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-lllsw" event={"ID":"8283839a-a189-493f-bde7-e0193d575963","Type":"ContainerStarted","Data":"65a27c548349241e02c7d0228cfc81d475fc080c5a9ec658d2e80d8ed1beb95e"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.022395 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lllsw" event={"ID":"8283839a-a189-493f-bde7-e0193d575963","Type":"ContainerStarted","Data":"fc48c8390c0ba5d87e6145fef0fdef834353249473f2027b646ff3f4e30ddddc"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.026959 4831 generic.go:334] "Generic (PLEG): container finished" podID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerID="0318b2ae97b99bfccd1fbe2b462c9dbee4817059d1caae8743e8c089b294c8ec" exitCode=0 Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.027022 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gsc" event={"ID":"5cdecddf-df66-4aef-bc33-65cbcf74db58","Type":"ContainerDied","Data":"0318b2ae97b99bfccd1fbe2b462c9dbee4817059d1caae8743e8c089b294c8ec"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.032578 4831 generic.go:334] "Generic (PLEG): container finished" podID="845a21ea-b176-471d-bacd-b98289285d1c" containerID="db811c1197eb8ded5e8633fffe11b1f2b010443ab51f8fb7c506e9d731b728db" exitCode=0 Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.032689 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4k78" event={"ID":"845a21ea-b176-471d-bacd-b98289285d1c","Type":"ContainerDied","Data":"db811c1197eb8ded5e8633fffe11b1f2b010443ab51f8fb7c506e9d731b728db"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.037183 4831 generic.go:334] "Generic (PLEG): container finished" podID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerID="b1adc17f6d3a925078a0c2fdea2ced839e910fc664b624901628cb1f9079d708" exitCode=0 Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.037233 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbzbt" event={"ID":"8ca97992-94be-4cf7-b532-1239da97bf6d","Type":"ContainerDied","Data":"b1adc17f6d3a925078a0c2fdea2ced839e910fc664b624901628cb1f9079d708"} Dec 03 06:34:00 crc kubenswrapper[4831]: I1203 06:34:00.266808 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Dec 03 06:34:01 crc kubenswrapper[4831]: I1203 06:34:01.049834 4831 generic.go:334] "Generic (PLEG): container finished" podID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerID="57d93b642634757b5ec710f07536cdbafa51fc4a24cd032e525ae5495a38e373" exitCode=0 Dec 03 06:34:01 crc kubenswrapper[4831]: I1203 06:34:01.049931 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6phx5" event={"ID":"3f4f1dcf-c437-4687-93b5-4198d80bff3d","Type":"ContainerDied","Data":"57d93b642634757b5ec710f07536cdbafa51fc4a24cd032e525ae5495a38e373"} Dec 03 06:34:01 crc kubenswrapper[4831]: I1203 06:34:01.056075 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-lllsw" event={"ID":"8283839a-a189-493f-bde7-e0193d575963","Type":"ContainerStarted","Data":"7b15867b44f0f0b8c8d2ba645dbbe9bcb0b21080b34dc8c799331e33ff884032"} Dec 03 06:34:01 crc kubenswrapper[4831]: I1203 06:34:01.094667 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-lllsw" podStartSLOduration=158.094640693 podStartE2EDuration="2m38.094640693s" podCreationTimestamp="2025-12-03 06:31:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:34:01.091485251 +0000 UTC m=+178.435068779" watchObservedRunningTime="2025-12-03 06:34:01.094640693 +0000 UTC m=+178.438224201" Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.063549 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndqmw" event={"ID":"ea4d526d-cb74-4ac6-a3fe-33aad14c3444","Type":"ContainerStarted","Data":"05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051"} Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.065093 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6phx5" event={"ID":"3f4f1dcf-c437-4687-93b5-4198d80bff3d","Type":"ContainerStarted","Data":"fb9965f7b23a6edb6430f8c54de03cdd208604f8c617b2c3062ba86f09933d76"} Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.067146 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4v6c6" event={"ID":"bf274771-c291-4ab4-9f69-1e1554707a6c","Type":"ContainerStarted","Data":"83f85b2da2946a979e73cc350b05e91bf7d357f8aa9bfb30ddcb7ecc07585feb"} Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.069988 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gsc" event={"ID":"5cdecddf-df66-4aef-bc33-65cbcf74db58","Type":"ContainerStarted","Data":"cb4727a6296a5b28db8342f8ce962ce1ce7b3856e62cb0313a7e26022c14d138"} Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.074760 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4k78" event={"ID":"845a21ea-b176-471d-bacd-b98289285d1c","Type":"ContainerStarted","Data":"be844767d89186f1f42cbb2aa7e52ea5a68fe21ab07b31f12974cdb26b031f0d"} Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.076830 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbzbt" event={"ID":"8ca97992-94be-4cf7-b532-1239da97bf6d","Type":"ContainerStarted","Data":"8c8ae87988c1198dbda0248024beaf074f2fde24376bc228b0ddf3b1e2b053cc"} Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.078652 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-rtq2s" event={"ID":"ccfbc043-c76e-4afd-a7e7-db0057427fa5","Type":"ContainerStarted","Data":"532e3ecfca649bcc16e35370a18d6adcedea5cff04eb73d44bf99d88c9a55ac6"} Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.085280 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ndqmw" podStartSLOduration=3.695563071 podStartE2EDuration="33.085267464s" podCreationTimestamp="2025-12-03 06:33:29 +0000 UTC" firstStartedPulling="2025-12-03 06:33:31.686497075 +0000 UTC m=+149.030080583" lastFinishedPulling="2025-12-03 06:34:01.076201468 +0000 UTC m=+178.419784976" observedRunningTime="2025-12-03 06:34:02.08419086 +0000 UTC m=+179.427774368" watchObservedRunningTime="2025-12-03 06:34:02.085267464 +0000 UTC m=+179.428850972" Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.099589 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-b6k5g" Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.101207 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-4v6c6" podStartSLOduration=3.6097491870000002 podStartE2EDuration="34.101190729s" podCreationTimestamp="2025-12-03 06:33:28 +0000 UTC" firstStartedPulling="2025-12-03 06:33:30.689532829 +0000 UTC m=+148.033116337" lastFinishedPulling="2025-12-03 06:34:01.180974361 +0000 UTC m=+178.524557879" observedRunningTime="2025-12-03 06:34:02.100659342 +0000 UTC m=+179.444242850" watchObservedRunningTime="2025-12-03 06:34:02.101190729 +0000 UTC m=+179.444774237" Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.119089 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-64gsc" podStartSLOduration=3.314855705 podStartE2EDuration="36.119073146s" podCreationTimestamp="2025-12-03 06:33:26 +0000 
UTC" firstStartedPulling="2025-12-03 06:33:28.473926809 +0000 UTC m=+145.817510317" lastFinishedPulling="2025-12-03 06:34:01.27814425 +0000 UTC m=+178.621727758" observedRunningTime="2025-12-03 06:34:02.118388534 +0000 UTC m=+179.461972042" watchObservedRunningTime="2025-12-03 06:34:02.119073146 +0000 UTC m=+179.462656654" Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.143862 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-s4k78" podStartSLOduration=4.321991439 podStartE2EDuration="36.143844956s" podCreationTimestamp="2025-12-03 06:33:26 +0000 UTC" firstStartedPulling="2025-12-03 06:33:29.562194033 +0000 UTC m=+146.905777541" lastFinishedPulling="2025-12-03 06:34:01.38404755 +0000 UTC m=+178.727631058" observedRunningTime="2025-12-03 06:34:02.138795263 +0000 UTC m=+179.482378771" watchObservedRunningTime="2025-12-03 06:34:02.143844956 +0000 UTC m=+179.487428464" Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.166370 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lbzbt" podStartSLOduration=3.779683644 podStartE2EDuration="34.166355673s" podCreationTimestamp="2025-12-03 06:33:28 +0000 UTC" firstStartedPulling="2025-12-03 06:33:30.69020315 +0000 UTC m=+148.033786658" lastFinishedPulling="2025-12-03 06:34:01.076875179 +0000 UTC m=+178.420458687" observedRunningTime="2025-12-03 06:34:02.164156072 +0000 UTC m=+179.507739580" watchObservedRunningTime="2025-12-03 06:34:02.166355673 +0000 UTC m=+179.509939181" Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.182140 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6phx5" podStartSLOduration=3.393289701 podStartE2EDuration="33.182122963s" podCreationTimestamp="2025-12-03 06:33:29 +0000 UTC" firstStartedPulling="2025-12-03 06:33:31.692246001 +0000 UTC m=+149.035829509" 
lastFinishedPulling="2025-12-03 06:34:01.481079263 +0000 UTC m=+178.824662771" observedRunningTime="2025-12-03 06:34:02.178763814 +0000 UTC m=+179.522347332" watchObservedRunningTime="2025-12-03 06:34:02.182122963 +0000 UTC m=+179.525706471" Dec 03 06:34:02 crc kubenswrapper[4831]: I1203 06:34:02.201418 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rtq2s" podStartSLOduration=4.774464652 podStartE2EDuration="36.201403005s" podCreationTimestamp="2025-12-03 06:33:26 +0000 UTC" firstStartedPulling="2025-12-03 06:33:29.579412429 +0000 UTC m=+146.922995947" lastFinishedPulling="2025-12-03 06:34:01.006350782 +0000 UTC m=+178.349934300" observedRunningTime="2025-12-03 06:34:02.200224937 +0000 UTC m=+179.543808445" watchObservedRunningTime="2025-12-03 06:34:02.201403005 +0000 UTC m=+179.544986513" Dec 03 06:34:06 crc kubenswrapper[4831]: I1203 06:34:06.785691 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:34:06 crc kubenswrapper[4831]: I1203 06:34:06.786258 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:34:06 crc kubenswrapper[4831]: I1203 06:34:06.859212 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:34:06 crc kubenswrapper[4831]: I1203 06:34:06.955987 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:34:06 crc kubenswrapper[4831]: I1203 06:34:06.956034 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:34:06 crc kubenswrapper[4831]: I1203 06:34:06.996498 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:34:07 crc kubenswrapper[4831]: I1203 06:34:07.142536 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:34:07 crc kubenswrapper[4831]: I1203 06:34:07.160247 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:34:07 crc kubenswrapper[4831]: I1203 06:34:07.192708 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:34:07 crc kubenswrapper[4831]: I1203 06:34:07.193011 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:34:07 crc kubenswrapper[4831]: I1203 06:34:07.517435 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.154335 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.327259 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:34:08 crc kubenswrapper[4831]: E1203 06:34:08.327473 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a36103d-2c3c-4625-85ff-678a5c4ee9cf" containerName="pruner" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.327484 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a36103d-2c3c-4625-85ff-678a5c4ee9cf" containerName="pruner" Dec 03 06:34:08 crc kubenswrapper[4831]: E1203 06:34:08.327493 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccb1bed3-01b5-43c0-bb47-c093c5b2050f" containerName="pruner" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 
06:34:08.327499 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccb1bed3-01b5-43c0-bb47-c093c5b2050f" containerName="pruner" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.327610 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a36103d-2c3c-4625-85ff-678a5c4ee9cf" containerName="pruner" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.327627 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccb1bed3-01b5-43c0-bb47-c093c5b2050f" containerName="pruner" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.327966 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.330226 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.331408 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.335984 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.437842 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/602d7cff-a928-4919-8f62-2e511822cc3d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"602d7cff-a928-4919-8f62-2e511822cc3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.437921 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/602d7cff-a928-4919-8f62-2e511822cc3d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: 
\"602d7cff-a928-4919-8f62-2e511822cc3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.539216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/602d7cff-a928-4919-8f62-2e511822cc3d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"602d7cff-a928-4919-8f62-2e511822cc3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.539291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/602d7cff-a928-4919-8f62-2e511822cc3d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"602d7cff-a928-4919-8f62-2e511822cc3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.539391 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/602d7cff-a928-4919-8f62-2e511822cc3d-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"602d7cff-a928-4919-8f62-2e511822cc3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.562759 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/602d7cff-a928-4919-8f62-2e511822cc3d-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"602d7cff-a928-4919-8f62-2e511822cc3d\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.650818 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.870959 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.908975 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.909041 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:34:08 crc kubenswrapper[4831]: I1203 06:34:08.948949 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:34:09 crc kubenswrapper[4831]: I1203 06:34:09.118537 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"602d7cff-a928-4919-8f62-2e511822cc3d","Type":"ContainerStarted","Data":"44670cfa16af23ae3b850b057595b433250f0b42779d0640b385ae2704c91f90"} Dec 03 06:34:09 crc kubenswrapper[4831]: I1203 06:34:09.153702 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:34:09 crc kubenswrapper[4831]: I1203 06:34:09.342380 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:34:09 crc kubenswrapper[4831]: I1203 06:34:09.343348 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:34:09 crc kubenswrapper[4831]: I1203 06:34:09.378788 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:34:09 crc kubenswrapper[4831]: I1203 06:34:09.988606 4831 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:34:09 crc kubenswrapper[4831]: I1203 06:34:09.988893 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.042218 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.124035 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"602d7cff-a928-4919-8f62-2e511822cc3d","Type":"ContainerStarted","Data":"4104f7b65f5fd2a57e5b38ef6dfb25b98e3aacfe18687421254d2ff8c2d4b4d5"} Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.178812 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=2.178793165 podStartE2EDuration="2.178793165s" podCreationTimestamp="2025-12-03 06:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:34:10.168026447 +0000 UTC m=+187.511609945" watchObservedRunningTime="2025-12-03 06:34:10.178793165 +0000 UTC m=+187.522376673" Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.184854 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87j4p"] Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.192916 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.220517 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.362066 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.362148 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.405191 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s4k78"] Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.405465 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-s4k78" podUID="845a21ea-b176-471d-bacd-b98289285d1c" containerName="registry-server" containerID="cri-o://be844767d89186f1f42cbb2aa7e52ea5a68fe21ab07b31f12974cdb26b031f0d" gracePeriod=2 Dec 03 06:34:10 crc kubenswrapper[4831]: I1203 06:34:10.416031 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.130735 4831 generic.go:334] "Generic (PLEG): container finished" podID="602d7cff-a928-4919-8f62-2e511822cc3d" containerID="4104f7b65f5fd2a57e5b38ef6dfb25b98e3aacfe18687421254d2ff8c2d4b4d5" exitCode=0 Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.130823 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"602d7cff-a928-4919-8f62-2e511822cc3d","Type":"ContainerDied","Data":"4104f7b65f5fd2a57e5b38ef6dfb25b98e3aacfe18687421254d2ff8c2d4b4d5"} Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.135750 4831 generic.go:334] "Generic (PLEG): container finished" podID="845a21ea-b176-471d-bacd-b98289285d1c" containerID="be844767d89186f1f42cbb2aa7e52ea5a68fe21ab07b31f12974cdb26b031f0d" exitCode=0 Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.135801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/certified-operators-s4k78" event={"ID":"845a21ea-b176-471d-bacd-b98289285d1c","Type":"ContainerDied","Data":"be844767d89186f1f42cbb2aa7e52ea5a68fe21ab07b31f12974cdb26b031f0d"} Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.179663 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.404543 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.477874 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-utilities\") pod \"845a21ea-b176-471d-bacd-b98289285d1c\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.478038 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-catalog-content\") pod \"845a21ea-b176-471d-bacd-b98289285d1c\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.478068 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l75q8\" (UniqueName: \"kubernetes.io/projected/845a21ea-b176-471d-bacd-b98289285d1c-kube-api-access-l75q8\") pod \"845a21ea-b176-471d-bacd-b98289285d1c\" (UID: \"845a21ea-b176-471d-bacd-b98289285d1c\") " Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.478782 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-utilities" (OuterVolumeSpecName: "utilities") pod "845a21ea-b176-471d-bacd-b98289285d1c" (UID: "845a21ea-b176-471d-bacd-b98289285d1c"). 
InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.483654 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/845a21ea-b176-471d-bacd-b98289285d1c-kube-api-access-l75q8" (OuterVolumeSpecName: "kube-api-access-l75q8") pod "845a21ea-b176-471d-bacd-b98289285d1c" (UID: "845a21ea-b176-471d-bacd-b98289285d1c"). InnerVolumeSpecName "kube-api-access-l75q8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.539676 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "845a21ea-b176-471d-bacd-b98289285d1c" (UID: "845a21ea-b176-471d-bacd-b98289285d1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.579336 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.579377 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l75q8\" (UniqueName: \"kubernetes.io/projected/845a21ea-b176-471d-bacd-b98289285d1c-kube-api-access-l75q8\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:11 crc kubenswrapper[4831]: I1203 06:34:11.579394 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/845a21ea-b176-471d-bacd-b98289285d1c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.143422 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-s4k78" 
event={"ID":"845a21ea-b176-471d-bacd-b98289285d1c","Type":"ContainerDied","Data":"75583351d7e90dfab5bf0a7d2fe52634e0484919e890762ae4abb5e540c3f976"} Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.143484 4831 scope.go:117] "RemoveContainer" containerID="be844767d89186f1f42cbb2aa7e52ea5a68fe21ab07b31f12974cdb26b031f0d" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.143705 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-s4k78" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.166664 4831 scope.go:117] "RemoveContainer" containerID="db811c1197eb8ded5e8633fffe11b1f2b010443ab51f8fb7c506e9d731b728db" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.174056 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-s4k78"] Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.180247 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-s4k78"] Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.202134 4831 scope.go:117] "RemoveContainer" containerID="804f9e8e87b6daeb6e06030bbcd451e81a53d6ac709746beb5b09c85f8d94454" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.202354 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbzbt"] Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.395859 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.488559 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/602d7cff-a928-4919-8f62-2e511822cc3d-kube-api-access\") pod \"602d7cff-a928-4919-8f62-2e511822cc3d\" (UID: \"602d7cff-a928-4919-8f62-2e511822cc3d\") " Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.488639 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/602d7cff-a928-4919-8f62-2e511822cc3d-kubelet-dir\") pod \"602d7cff-a928-4919-8f62-2e511822cc3d\" (UID: \"602d7cff-a928-4919-8f62-2e511822cc3d\") " Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.488844 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/602d7cff-a928-4919-8f62-2e511822cc3d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "602d7cff-a928-4919-8f62-2e511822cc3d" (UID: "602d7cff-a928-4919-8f62-2e511822cc3d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.495473 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/602d7cff-a928-4919-8f62-2e511822cc3d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "602d7cff-a928-4919-8f62-2e511822cc3d" (UID: "602d7cff-a928-4919-8f62-2e511822cc3d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.589262 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/602d7cff-a928-4919-8f62-2e511822cc3d-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.589296 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/602d7cff-a928-4919-8f62-2e511822cc3d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:12 crc kubenswrapper[4831]: I1203 06:34:12.798869 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6phx5"] Dec 03 06:34:13 crc kubenswrapper[4831]: I1203 06:34:13.020190 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="845a21ea-b176-471d-bacd-b98289285d1c" path="/var/lib/kubelet/pods/845a21ea-b176-471d-bacd-b98289285d1c/volumes" Dec 03 06:34:13 crc kubenswrapper[4831]: I1203 06:34:13.147780 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"602d7cff-a928-4919-8f62-2e511822cc3d","Type":"ContainerDied","Data":"44670cfa16af23ae3b850b057595b433250f0b42779d0640b385ae2704c91f90"} Dec 03 06:34:13 crc kubenswrapper[4831]: I1203 06:34:13.147815 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44670cfa16af23ae3b850b057595b433250f0b42779d0640b385ae2704c91f90" Dec 03 06:34:13 crc kubenswrapper[4831]: I1203 06:34:13.147862 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Dec 03 06:34:13 crc kubenswrapper[4831]: I1203 06:34:13.149729 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lbzbt" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerName="registry-server" containerID="cri-o://8c8ae87988c1198dbda0248024beaf074f2fde24376bc228b0ddf3b1e2b053cc" gracePeriod=2 Dec 03 06:34:13 crc kubenswrapper[4831]: I1203 06:34:13.149899 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6phx5" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerName="registry-server" containerID="cri-o://fb9965f7b23a6edb6430f8c54de03cdd208604f8c617b2c3062ba86f09933d76" gracePeriod=2 Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.157335 4831 generic.go:334] "Generic (PLEG): container finished" podID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerID="8c8ae87988c1198dbda0248024beaf074f2fde24376bc228b0ddf3b1e2b053cc" exitCode=0 Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.157351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbzbt" event={"ID":"8ca97992-94be-4cf7-b532-1239da97bf6d","Type":"ContainerDied","Data":"8c8ae87988c1198dbda0248024beaf074f2fde24376bc228b0ddf3b1e2b053cc"} Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.160196 4831 generic.go:334] "Generic (PLEG): container finished" podID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerID="fb9965f7b23a6edb6430f8c54de03cdd208604f8c617b2c3062ba86f09933d76" exitCode=0 Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.160226 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6phx5" event={"ID":"3f4f1dcf-c437-4687-93b5-4198d80bff3d","Type":"ContainerDied","Data":"fb9965f7b23a6edb6430f8c54de03cdd208604f8c617b2c3062ba86f09933d76"} Dec 03 06:34:14 crc 
kubenswrapper[4831]: I1203 06:34:14.231840 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.314018 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-utilities\") pod \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.314058 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-catalog-content\") pod \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.314085 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxh88\" (UniqueName: \"kubernetes.io/projected/3f4f1dcf-c437-4687-93b5-4198d80bff3d-kube-api-access-bxh88\") pod \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\" (UID: \"3f4f1dcf-c437-4687-93b5-4198d80bff3d\") " Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.314979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-utilities" (OuterVolumeSpecName: "utilities") pod "3f4f1dcf-c437-4687-93b5-4198d80bff3d" (UID: "3f4f1dcf-c437-4687-93b5-4198d80bff3d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.318791 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4f1dcf-c437-4687-93b5-4198d80bff3d-kube-api-access-bxh88" (OuterVolumeSpecName: "kube-api-access-bxh88") pod "3f4f1dcf-c437-4687-93b5-4198d80bff3d" (UID: "3f4f1dcf-c437-4687-93b5-4198d80bff3d"). InnerVolumeSpecName "kube-api-access-bxh88". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.415768 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.415795 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bxh88\" (UniqueName: \"kubernetes.io/projected/3f4f1dcf-c437-4687-93b5-4198d80bff3d-kube-api-access-bxh88\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.429015 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3f4f1dcf-c437-4687-93b5-4198d80bff3d" (UID: "3f4f1dcf-c437-4687-93b5-4198d80bff3d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.450929 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.517138 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9z82m\" (UniqueName: \"kubernetes.io/projected/8ca97992-94be-4cf7-b532-1239da97bf6d-kube-api-access-9z82m\") pod \"8ca97992-94be-4cf7-b532-1239da97bf6d\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.517194 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-catalog-content\") pod \"8ca97992-94be-4cf7-b532-1239da97bf6d\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.517231 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-utilities\") pod \"8ca97992-94be-4cf7-b532-1239da97bf6d\" (UID: \"8ca97992-94be-4cf7-b532-1239da97bf6d\") " Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.517557 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3f4f1dcf-c437-4687-93b5-4198d80bff3d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.518182 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-utilities" (OuterVolumeSpecName: "utilities") pod "8ca97992-94be-4cf7-b532-1239da97bf6d" (UID: "8ca97992-94be-4cf7-b532-1239da97bf6d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.520385 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ca97992-94be-4cf7-b532-1239da97bf6d-kube-api-access-9z82m" (OuterVolumeSpecName: "kube-api-access-9z82m") pod "8ca97992-94be-4cf7-b532-1239da97bf6d" (UID: "8ca97992-94be-4cf7-b532-1239da97bf6d"). InnerVolumeSpecName "kube-api-access-9z82m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.537036 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8ca97992-94be-4cf7-b532-1239da97bf6d" (UID: "8ca97992-94be-4cf7-b532-1239da97bf6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.618930 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9z82m\" (UniqueName: \"kubernetes.io/projected/8ca97992-94be-4cf7-b532-1239da97bf6d-kube-api-access-9z82m\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.618976 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:14 crc kubenswrapper[4831]: I1203 06:34:14.618992 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8ca97992-94be-4cf7-b532-1239da97bf6d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.166971 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbzbt" 
event={"ID":"8ca97992-94be-4cf7-b532-1239da97bf6d","Type":"ContainerDied","Data":"e62cdeeccc9441442d1f68d29ba0d6803b936823a1e3de4bd11477d6441368d0"} Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.166986 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbzbt" Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.167081 4831 scope.go:117] "RemoveContainer" containerID="8c8ae87988c1198dbda0248024beaf074f2fde24376bc228b0ddf3b1e2b053cc" Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.171612 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6phx5" event={"ID":"3f4f1dcf-c437-4687-93b5-4198d80bff3d","Type":"ContainerDied","Data":"82e628eac0c30eed291f5f076cc698cc79198c491df5756f3b8884a109a88b48"} Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.171660 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6phx5" Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.185120 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbzbt"] Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.191734 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbzbt"] Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.196555 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6phx5"] Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.199927 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6phx5"] Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.295216 4831 scope.go:117] "RemoveContainer" containerID="b1adc17f6d3a925078a0c2fdea2ced839e910fc664b624901628cb1f9079d708" Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.316747 4831 scope.go:117] 
"RemoveContainer" containerID="832b8b90569f40454e7f5ffc7b4554d531da87cf26dde69e2836aca39385447b" Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.341579 4831 scope.go:117] "RemoveContainer" containerID="fb9965f7b23a6edb6430f8c54de03cdd208604f8c617b2c3062ba86f09933d76" Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.379436 4831 scope.go:117] "RemoveContainer" containerID="57d93b642634757b5ec710f07536cdbafa51fc4a24cd032e525ae5495a38e373" Dec 03 06:34:15 crc kubenswrapper[4831]: I1203 06:34:15.400881 4831 scope.go:117] "RemoveContainer" containerID="4cf808f161d897df1e1526d2227b0b99ae0ac96b594c3ea0ce5019a1fc4fa55d" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.180488 4831 generic.go:334] "Generic (PLEG): container finished" podID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerID="588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b" exitCode=0 Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.180521 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdmr5" event={"ID":"de96d02d-bef7-48a1-85f6-2d6086ff6498","Type":"ContainerDied","Data":"588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b"} Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.728776 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729120 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729143 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729159 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845a21ea-b176-471d-bacd-b98289285d1c" containerName="extract-utilities" Dec 03 06:34:16 crc 
kubenswrapper[4831]: I1203 06:34:16.729168 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="845a21ea-b176-471d-bacd-b98289285d1c" containerName="extract-utilities" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729184 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerName="extract-content" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729194 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerName="extract-content" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729206 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerName="extract-utilities" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729214 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerName="extract-utilities" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729228 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerName="extract-utilities" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729237 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerName="extract-utilities" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729251 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845a21ea-b176-471d-bacd-b98289285d1c" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729261 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="845a21ea-b176-471d-bacd-b98289285d1c" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729273 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="602d7cff-a928-4919-8f62-2e511822cc3d" containerName="pruner" Dec 03 06:34:16 crc 
kubenswrapper[4831]: I1203 06:34:16.729282 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="602d7cff-a928-4919-8f62-2e511822cc3d" containerName="pruner" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729300 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerName="extract-content" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729308 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerName="extract-content" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729337 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729346 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: E1203 06:34:16.729358 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="845a21ea-b176-471d-bacd-b98289285d1c" containerName="extract-content" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729366 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="845a21ea-b176-471d-bacd-b98289285d1c" containerName="extract-content" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729472 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="602d7cff-a928-4919-8f62-2e511822cc3d" containerName="pruner" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729486 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729506 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="845a21ea-b176-471d-bacd-b98289285d1c" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: 
I1203 06:34:16.729520 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" containerName="registry-server" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.729910 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.734714 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.738810 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.739084 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.743375 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kube-api-access\") pod \"installer-9-crc\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.743560 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-var-lock\") pod \"installer-9-crc\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.743659 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kubelet-dir\") pod \"installer-9-crc\" (UID: 
\"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.845699 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kube-api-access\") pod \"installer-9-crc\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.845957 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-var-lock\") pod \"installer-9-crc\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.845996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.846054 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kubelet-dir\") pod \"installer-9-crc\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.846089 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-var-lock\") pod \"installer-9-crc\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:16 crc kubenswrapper[4831]: I1203 06:34:16.887845 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kube-api-access\") pod \"installer-9-crc\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:17 crc kubenswrapper[4831]: I1203 06:34:17.018493 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4f1dcf-c437-4687-93b5-4198d80bff3d" path="/var/lib/kubelet/pods/3f4f1dcf-c437-4687-93b5-4198d80bff3d/volumes" Dec 03 06:34:17 crc kubenswrapper[4831]: I1203 06:34:17.019256 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ca97992-94be-4cf7-b532-1239da97bf6d" path="/var/lib/kubelet/pods/8ca97992-94be-4cf7-b532-1239da97bf6d/volumes" Dec 03 06:34:17 crc kubenswrapper[4831]: I1203 06:34:17.072454 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:17 crc kubenswrapper[4831]: I1203 06:34:17.200449 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdmr5" event={"ID":"de96d02d-bef7-48a1-85f6-2d6086ff6498","Type":"ContainerStarted","Data":"f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a"} Dec 03 06:34:17 crc kubenswrapper[4831]: I1203 06:34:17.228092 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-fdmr5" podStartSLOduration=4.05507 podStartE2EDuration="51.228071702s" podCreationTimestamp="2025-12-03 06:33:26 +0000 UTC" firstStartedPulling="2025-12-03 06:33:29.508652574 +0000 UTC m=+146.852236082" lastFinishedPulling="2025-12-03 06:34:16.681654276 +0000 UTC m=+194.025237784" observedRunningTime="2025-12-03 06:34:17.224032773 +0000 UTC m=+194.567616281" watchObservedRunningTime="2025-12-03 06:34:17.228071702 +0000 UTC m=+194.571655210" Dec 03 06:34:17 crc kubenswrapper[4831]: I1203 
06:34:17.365285 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:34:17 crc kubenswrapper[4831]: I1203 06:34:17.365348 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:34:17 crc kubenswrapper[4831]: I1203 06:34:17.456020 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Dec 03 06:34:17 crc kubenswrapper[4831]: W1203 06:34:17.465408 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbaccf8bb_eb1e_4298_8841_0aaf91b213f6.slice/crio-ddd82b6040ab359194a6bb32b371cf954cbfe6420235ca5bc36716de987fc41f WatchSource:0}: Error finding container ddd82b6040ab359194a6bb32b371cf954cbfe6420235ca5bc36716de987fc41f: Status 404 returned error can't find the container with id ddd82b6040ab359194a6bb32b371cf954cbfe6420235ca5bc36716de987fc41f Dec 03 06:34:18 crc kubenswrapper[4831]: I1203 06:34:18.207704 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"baccf8bb-eb1e-4298-8841-0aaf91b213f6","Type":"ContainerStarted","Data":"709810af3d5ef6ce2ffccdad589fd52439a4e3c137d7a417e881a9c993bbf549"} Dec 03 06:34:18 crc kubenswrapper[4831]: I1203 06:34:18.208100 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"baccf8bb-eb1e-4298-8841-0aaf91b213f6","Type":"ContainerStarted","Data":"ddd82b6040ab359194a6bb32b371cf954cbfe6420235ca5bc36716de987fc41f"} Dec 03 06:34:18 crc kubenswrapper[4831]: I1203 06:34:18.226249 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=2.226229217 podStartE2EDuration="2.226229217s" podCreationTimestamp="2025-12-03 06:34:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:34:18.223791889 +0000 UTC m=+195.567375397" watchObservedRunningTime="2025-12-03 06:34:18.226229217 +0000 UTC m=+195.569812745" Dec 03 06:34:18 crc kubenswrapper[4831]: I1203 06:34:18.404441 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-fdmr5" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="registry-server" probeResult="failure" output=< Dec 03 06:34:18 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 06:34:18 crc kubenswrapper[4831]: > Dec 03 06:34:25 crc kubenswrapper[4831]: I1203 06:34:25.766102 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-22sdc"] Dec 03 06:34:25 crc kubenswrapper[4831]: I1203 06:34:25.766961 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" podUID="699c38a7-81dd-4614-8bb9-cd97b5756fc4" containerName="controller-manager" containerID="cri-o://21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3" gracePeriod=30 Dec 03 06:34:25 crc kubenswrapper[4831]: I1203 06:34:25.789209 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch"] Dec 03 06:34:25 crc kubenswrapper[4831]: I1203 06:34:25.789776 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" podUID="8cfeb8ef-4262-4aef-a179-3018896ace13" containerName="route-controller-manager" containerID="cri-o://ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899" gracePeriod=30 Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.138266 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.144745 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156364 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-config\") pod \"8cfeb8ef-4262-4aef-a179-3018896ace13\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156482 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-config\") pod \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156535 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-proxy-ca-bundles\") pod \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156565 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-client-ca\") pod \"8cfeb8ef-4262-4aef-a179-3018896ace13\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156598 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfeb8ef-4262-4aef-a179-3018896ace13-serving-cert\") pod \"8cfeb8ef-4262-4aef-a179-3018896ace13\" 
(UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156626 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvkpr\" (UniqueName: \"kubernetes.io/projected/699c38a7-81dd-4614-8bb9-cd97b5756fc4-kube-api-access-cvkpr\") pod \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156658 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-client-ca\") pod \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156723 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699c38a7-81dd-4614-8bb9-cd97b5756fc4-serving-cert\") pod \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\" (UID: \"699c38a7-81dd-4614-8bb9-cd97b5756fc4\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.156818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w55j7\" (UniqueName: \"kubernetes.io/projected/8cfeb8ef-4262-4aef-a179-3018896ace13-kube-api-access-w55j7\") pod \"8cfeb8ef-4262-4aef-a179-3018896ace13\" (UID: \"8cfeb8ef-4262-4aef-a179-3018896ace13\") " Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.157279 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "699c38a7-81dd-4614-8bb9-cd97b5756fc4" (UID: "699c38a7-81dd-4614-8bb9-cd97b5756fc4"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.157414 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-client-ca" (OuterVolumeSpecName: "client-ca") pod "8cfeb8ef-4262-4aef-a179-3018896ace13" (UID: "8cfeb8ef-4262-4aef-a179-3018896ace13"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.158362 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-client-ca" (OuterVolumeSpecName: "client-ca") pod "699c38a7-81dd-4614-8bb9-cd97b5756fc4" (UID: "699c38a7-81dd-4614-8bb9-cd97b5756fc4"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.158564 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-config" (OuterVolumeSpecName: "config") pod "8cfeb8ef-4262-4aef-a179-3018896ace13" (UID: "8cfeb8ef-4262-4aef-a179-3018896ace13"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.158795 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-config" (OuterVolumeSpecName: "config") pod "699c38a7-81dd-4614-8bb9-cd97b5756fc4" (UID: "699c38a7-81dd-4614-8bb9-cd97b5756fc4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.163282 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/699c38a7-81dd-4614-8bb9-cd97b5756fc4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "699c38a7-81dd-4614-8bb9-cd97b5756fc4" (UID: "699c38a7-81dd-4614-8bb9-cd97b5756fc4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.164490 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/699c38a7-81dd-4614-8bb9-cd97b5756fc4-kube-api-access-cvkpr" (OuterVolumeSpecName: "kube-api-access-cvkpr") pod "699c38a7-81dd-4614-8bb9-cd97b5756fc4" (UID: "699c38a7-81dd-4614-8bb9-cd97b5756fc4"). InnerVolumeSpecName "kube-api-access-cvkpr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.168987 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfeb8ef-4262-4aef-a179-3018896ace13-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cfeb8ef-4262-4aef-a179-3018896ace13" (UID: "8cfeb8ef-4262-4aef-a179-3018896ace13"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.175155 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cfeb8ef-4262-4aef-a179-3018896ace13-kube-api-access-w55j7" (OuterVolumeSpecName: "kube-api-access-w55j7") pod "8cfeb8ef-4262-4aef-a179-3018896ace13" (UID: "8cfeb8ef-4262-4aef-a179-3018896ace13"). InnerVolumeSpecName "kube-api-access-w55j7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.253276 4831 generic.go:334] "Generic (PLEG): container finished" podID="8cfeb8ef-4262-4aef-a179-3018896ace13" containerID="ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899" exitCode=0 Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.253409 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.253495 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" event={"ID":"8cfeb8ef-4262-4aef-a179-3018896ace13","Type":"ContainerDied","Data":"ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899"} Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.253582 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch" event={"ID":"8cfeb8ef-4262-4aef-a179-3018896ace13","Type":"ContainerDied","Data":"c8b154f64d0cb5d1e9286530da0327d0886f1a3e1ec8d9b3c3422cb1c5653f64"} Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.253616 4831 scope.go:117] "RemoveContainer" containerID="ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.256426 4831 generic.go:334] "Generic (PLEG): container finished" podID="699c38a7-81dd-4614-8bb9-cd97b5756fc4" containerID="21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3" exitCode=0 Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.256477 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" 
event={"ID":"699c38a7-81dd-4614-8bb9-cd97b5756fc4","Type":"ContainerDied","Data":"21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3"} Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.256521 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" event={"ID":"699c38a7-81dd-4614-8bb9-cd97b5756fc4","Type":"ContainerDied","Data":"ae95805608391884a4f8608836752fca345ab3acf2daa14ef0ff7f4d0bc2fa45"} Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.256484 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-22sdc" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258252 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258276 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/699c38a7-81dd-4614-8bb9-cd97b5756fc4-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258285 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w55j7\" (UniqueName: \"kubernetes.io/projected/8cfeb8ef-4262-4aef-a179-3018896ace13-kube-api-access-w55j7\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258295 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258303 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-config\") on node \"crc\" DevicePath 
\"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258325 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/699c38a7-81dd-4614-8bb9-cd97b5756fc4-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258334 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8cfeb8ef-4262-4aef-a179-3018896ace13-client-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258342 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfeb8ef-4262-4aef-a179-3018896ace13-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.258350 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvkpr\" (UniqueName: \"kubernetes.io/projected/699c38a7-81dd-4614-8bb9-cd97b5756fc4-kube-api-access-cvkpr\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.276844 4831 scope.go:117] "RemoveContainer" containerID="ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899" Dec 03 06:34:26 crc kubenswrapper[4831]: E1203 06:34:26.277306 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899\": container with ID starting with ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899 not found: ID does not exist" containerID="ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.277411 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899"} err="failed to get container status 
\"ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899\": rpc error: code = NotFound desc = could not find container \"ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899\": container with ID starting with ba316c9a02f345418fa524952d6b5564f14d1fcf784ad74d80654167161a1899 not found: ID does not exist" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.277475 4831 scope.go:117] "RemoveContainer" containerID="21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.291004 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch"] Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.293720 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-4jrch"] Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.303105 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-22sdc"] Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.305450 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-22sdc"] Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.308883 4831 scope.go:117] "RemoveContainer" containerID="21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3" Dec 03 06:34:26 crc kubenswrapper[4831]: E1203 06:34:26.309828 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3\": container with ID starting with 21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3 not found: ID does not exist" containerID="21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3" Dec 03 06:34:26 crc kubenswrapper[4831]: I1203 06:34:26.309861 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3"} err="failed to get container status \"21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3\": rpc error: code = NotFound desc = could not find container \"21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3\": container with ID starting with 21d9b1b36c3eb5e424864eeb70810632c5bdc3d03eb27cce3be6a9f83485f5c3 not found: ID does not exist" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.022490 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="699c38a7-81dd-4614-8bb9-cd97b5756fc4" path="/var/lib/kubelet/pods/699c38a7-81dd-4614-8bb9-cd97b5756fc4/volumes" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.023092 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cfeb8ef-4262-4aef-a179-3018896ace13" path="/var/lib/kubelet/pods/8cfeb8ef-4262-4aef-a179-3018896ace13/volumes" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.413540 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.453924 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.596279 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.596366 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.596419 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.597038 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.597103 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c" gracePeriod=600 Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.627795 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-59f79b7694-cd2tn"] Dec 03 06:34:27 crc kubenswrapper[4831]: E1203 06:34:27.628087 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="699c38a7-81dd-4614-8bb9-cd97b5756fc4" containerName="controller-manager" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.628108 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="699c38a7-81dd-4614-8bb9-cd97b5756fc4" containerName="controller-manager" Dec 03 06:34:27 crc kubenswrapper[4831]: E1203 06:34:27.628126 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cfeb8ef-4262-4aef-a179-3018896ace13" containerName="route-controller-manager" 
Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.628135 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfeb8ef-4262-4aef-a179-3018896ace13" containerName="route-controller-manager" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.628251 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="699c38a7-81dd-4614-8bb9-cd97b5756fc4" containerName="controller-manager" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.628268 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cfeb8ef-4262-4aef-a179-3018896ace13" containerName="route-controller-manager" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.628720 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.631071 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz"] Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.631825 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.635003 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.635010 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.636454 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.636579 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.636982 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.637020 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.637083 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.637146 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.637215 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.637106 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.637926 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.646778 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.649143 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59f79b7694-cd2tn"] Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.654587 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz"] Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.654611 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.675420 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-client-ca\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.675470 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-config\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.675524 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c616f87-e9b7-492b-addc-73088c201213-serving-cert\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.675551 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dth2d\" (UniqueName: \"kubernetes.io/projected/7cfdd9a8-5caa-4426-8c88-15340686da2d-kube-api-access-dth2d\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.675619 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c966n\" (UniqueName: \"kubernetes.io/projected/8c616f87-e9b7-492b-addc-73088c201213-kube-api-access-c966n\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.675670 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cfdd9a8-5caa-4426-8c88-15340686da2d-serving-cert\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.675923 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-client-ca\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: 
\"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.675995 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-proxy-ca-bundles\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.676028 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-config\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.780835 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-client-ca\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.780919 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-config\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.780990 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/8c616f87-e9b7-492b-addc-73088c201213-serving-cert\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.781025 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dth2d\" (UniqueName: \"kubernetes.io/projected/7cfdd9a8-5caa-4426-8c88-15340686da2d-kube-api-access-dth2d\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.781061 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c966n\" (UniqueName: \"kubernetes.io/projected/8c616f87-e9b7-492b-addc-73088c201213-kube-api-access-c966n\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.781099 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cfdd9a8-5caa-4426-8c88-15340686da2d-serving-cert\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.781182 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-client-ca\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " 
pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.781215 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-proxy-ca-bundles\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.781244 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-config\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.782489 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-client-ca\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.782572 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-config\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.783943 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-client-ca\") pod 
\"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.784403 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-proxy-ca-bundles\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.785024 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-config\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.806810 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c616f87-e9b7-492b-addc-73088c201213-serving-cert\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.807279 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cfdd9a8-5caa-4426-8c88-15340686da2d-serving-cert\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.809661 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c966n\" 
(UniqueName: \"kubernetes.io/projected/8c616f87-e9b7-492b-addc-73088c201213-kube-api-access-c966n\") pod \"route-controller-manager-568767bfc4-qbswz\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") " pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.810935 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dth2d\" (UniqueName: \"kubernetes.io/projected/7cfdd9a8-5caa-4426-8c88-15340686da2d-kube-api-access-dth2d\") pod \"controller-manager-59f79b7694-cd2tn\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") " pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.948862 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:27 crc kubenswrapper[4831]: I1203 06:34:27.961134 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:28 crc kubenswrapper[4831]: I1203 06:34:28.181679 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-59f79b7694-cd2tn"] Dec 03 06:34:28 crc kubenswrapper[4831]: W1203 06:34:28.185341 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cfdd9a8_5caa_4426_8c88_15340686da2d.slice/crio-ddfa3f4bc6eed394e9097d7bc89d5f3d00c9943b85b2e5830d3237351daf5766 WatchSource:0}: Error finding container ddfa3f4bc6eed394e9097d7bc89d5f3d00c9943b85b2e5830d3237351daf5766: Status 404 returned error can't find the container with id ddfa3f4bc6eed394e9097d7bc89d5f3d00c9943b85b2e5830d3237351daf5766 Dec 03 06:34:28 crc kubenswrapper[4831]: I1203 06:34:28.232750 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz"] Dec 03 06:34:28 crc kubenswrapper[4831]: W1203 06:34:28.239238 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c616f87_e9b7_492b_addc_73088c201213.slice/crio-5f3edbfea3ccf4112e5acd98ba8c6e0c64a70fd785d1ea074f04211101efa88a WatchSource:0}: Error finding container 5f3edbfea3ccf4112e5acd98ba8c6e0c64a70fd785d1ea074f04211101efa88a: Status 404 returned error can't find the container with id 5f3edbfea3ccf4112e5acd98ba8c6e0c64a70fd785d1ea074f04211101efa88a Dec 03 06:34:28 crc kubenswrapper[4831]: I1203 06:34:28.271176 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" event={"ID":"8c616f87-e9b7-492b-addc-73088c201213","Type":"ContainerStarted","Data":"5f3edbfea3ccf4112e5acd98ba8c6e0c64a70fd785d1ea074f04211101efa88a"} Dec 03 06:34:28 crc kubenswrapper[4831]: I1203 06:34:28.272717 4831 
generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c" exitCode=0 Dec 03 06:34:28 crc kubenswrapper[4831]: I1203 06:34:28.272756 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c"} Dec 03 06:34:28 crc kubenswrapper[4831]: I1203 06:34:28.278761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" event={"ID":"7cfdd9a8-5caa-4426-8c88-15340686da2d","Type":"ContainerStarted","Data":"ddfa3f4bc6eed394e9097d7bc89d5f3d00c9943b85b2e5830d3237351daf5766"} Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.289356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"b518ced4b22d22ab3d00838b322beb819fadfbd73a5137b1811cce469ae983d6"} Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.293061 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" event={"ID":"7cfdd9a8-5caa-4426-8c88-15340686da2d","Type":"ContainerStarted","Data":"8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2"} Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.293381 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.298737 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" 
event={"ID":"8c616f87-e9b7-492b-addc-73088c201213","Type":"ContainerStarted","Data":"3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70"} Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.300842 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.304940 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.309867 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.348563 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" podStartSLOduration=4.348539545 podStartE2EDuration="4.348539545s" podCreationTimestamp="2025-12-03 06:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:34:29.342163269 +0000 UTC m=+206.685746807" watchObservedRunningTime="2025-12-03 06:34:29.348539545 +0000 UTC m=+206.692123063" Dec 03 06:34:29 crc kubenswrapper[4831]: I1203 06:34:29.423593 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" podStartSLOduration=4.423554353 podStartE2EDuration="4.423554353s" podCreationTimestamp="2025-12-03 06:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:34:29.419548984 +0000 UTC m=+206.763132582" watchObservedRunningTime="2025-12-03 06:34:29.423554353 +0000 UTC m=+206.767137901" Dec 
03 06:34:30 crc kubenswrapper[4831]: I1203 06:34:30.604759 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdmr5"] Dec 03 06:34:30 crc kubenswrapper[4831]: I1203 06:34:30.605155 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-fdmr5" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="registry-server" containerID="cri-o://f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a" gracePeriod=2 Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.068662 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.131791 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9b9xn\" (UniqueName: \"kubernetes.io/projected/de96d02d-bef7-48a1-85f6-2d6086ff6498-kube-api-access-9b9xn\") pod \"de96d02d-bef7-48a1-85f6-2d6086ff6498\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.131872 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-catalog-content\") pod \"de96d02d-bef7-48a1-85f6-2d6086ff6498\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.132181 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-utilities\") pod \"de96d02d-bef7-48a1-85f6-2d6086ff6498\" (UID: \"de96d02d-bef7-48a1-85f6-2d6086ff6498\") " Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.133423 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-utilities" (OuterVolumeSpecName: "utilities") pod "de96d02d-bef7-48a1-85f6-2d6086ff6498" (UID: "de96d02d-bef7-48a1-85f6-2d6086ff6498"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.133845 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.138977 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de96d02d-bef7-48a1-85f6-2d6086ff6498-kube-api-access-9b9xn" (OuterVolumeSpecName: "kube-api-access-9b9xn") pod "de96d02d-bef7-48a1-85f6-2d6086ff6498" (UID: "de96d02d-bef7-48a1-85f6-2d6086ff6498"). InnerVolumeSpecName "kube-api-access-9b9xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.197087 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de96d02d-bef7-48a1-85f6-2d6086ff6498" (UID: "de96d02d-bef7-48a1-85f6-2d6086ff6498"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.236364 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9b9xn\" (UniqueName: \"kubernetes.io/projected/de96d02d-bef7-48a1-85f6-2d6086ff6498-kube-api-access-9b9xn\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.236412 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de96d02d-bef7-48a1-85f6-2d6086ff6498-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.315122 4831 generic.go:334] "Generic (PLEG): container finished" podID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerID="f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a" exitCode=0 Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.316380 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-fdmr5" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.317283 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdmr5" event={"ID":"de96d02d-bef7-48a1-85f6-2d6086ff6498","Type":"ContainerDied","Data":"f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a"} Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.317405 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-fdmr5" event={"ID":"de96d02d-bef7-48a1-85f6-2d6086ff6498","Type":"ContainerDied","Data":"ea6004a14ec9765a93c18a26d241f288366156cb5a7a54206e46e2a8620f6f86"} Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.317439 4831 scope.go:117] "RemoveContainer" containerID="f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.345171 4831 scope.go:117] "RemoveContainer" 
containerID="588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.369089 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-fdmr5"] Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.372726 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-fdmr5"] Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.386005 4831 scope.go:117] "RemoveContainer" containerID="8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.414869 4831 scope.go:117] "RemoveContainer" containerID="f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a" Dec 03 06:34:31 crc kubenswrapper[4831]: E1203 06:34:31.415376 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a\": container with ID starting with f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a not found: ID does not exist" containerID="f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.415413 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a"} err="failed to get container status \"f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a\": rpc error: code = NotFound desc = could not find container \"f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a\": container with ID starting with f9a57183f9409e1128477bdee490e393070cac6ce488ca13fd50c42a464e845a not found: ID does not exist" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.415440 4831 scope.go:117] "RemoveContainer" 
containerID="588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b" Dec 03 06:34:31 crc kubenswrapper[4831]: E1203 06:34:31.415918 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b\": container with ID starting with 588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b not found: ID does not exist" containerID="588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.416003 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b"} err="failed to get container status \"588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b\": rpc error: code = NotFound desc = could not find container \"588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b\": container with ID starting with 588570f019c021f536234edc6757f2bfad7f083e7c27ab809556f72d558b6a8b not found: ID does not exist" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.416072 4831 scope.go:117] "RemoveContainer" containerID="8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d" Dec 03 06:34:31 crc kubenswrapper[4831]: E1203 06:34:31.416404 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d\": container with ID starting with 8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d not found: ID does not exist" containerID="8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d" Dec 03 06:34:31 crc kubenswrapper[4831]: I1203 06:34:31.416487 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d"} err="failed to get container status \"8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d\": rpc error: code = NotFound desc = could not find container \"8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d\": container with ID starting with 8fe7b1b56da394b9c9fee8380d217c0f1e9ae002319006dc11321d933d9a8c8d not found: ID does not exist" Dec 03 06:34:33 crc kubenswrapper[4831]: I1203 06:34:33.025781 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" path="/var/lib/kubelet/pods/de96d02d-bef7-48a1-85f6-2d6086ff6498/volumes" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.220032 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" podUID="199e29b9-0d3f-471b-bf0e-de1576f2654a" containerName="oauth-openshift" containerID="cri-o://8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1" gracePeriod=15 Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.699507 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.796740 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-session\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.796792 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-idp-0-file-data\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.796822 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-cliconfig\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.796840 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-login\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.796909 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-dir\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc 
kubenswrapper[4831]: I1203 06:34:35.796960 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbhrj\" (UniqueName: \"kubernetes.io/projected/199e29b9-0d3f-471b-bf0e-de1576f2654a-kube-api-access-fbhrj\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.796978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-router-certs\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797011 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-serving-cert\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797064 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-provider-selection\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797290 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-error\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797660 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797850 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-service-ca\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797893 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-ocp-branding-template\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797918 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-trusted-ca-bundle\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.797956 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-policies\") pod \"199e29b9-0d3f-471b-bf0e-de1576f2654a\" (UID: \"199e29b9-0d3f-471b-bf0e-de1576f2654a\") " Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.798193 4831 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.798211 4831 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.798295 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.798632 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.798643 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.802658 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.803674 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.805051 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.806606 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.807239 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.807471 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.807778 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.808012 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.811750 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/199e29b9-0d3f-471b-bf0e-de1576f2654a-kube-api-access-fbhrj" (OuterVolumeSpecName: "kube-api-access-fbhrj") pod "199e29b9-0d3f-471b-bf0e-de1576f2654a" (UID: "199e29b9-0d3f-471b-bf0e-de1576f2654a"). InnerVolumeSpecName "kube-api-access-fbhrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899845 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899877 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbhrj\" (UniqueName: \"kubernetes.io/projected/199e29b9-0d3f-471b-bf0e-de1576f2654a-kube-api-access-fbhrj\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899888 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899898 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899909 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-provider-selection\") on node 
\"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899919 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899928 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899937 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899946 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899956 4831 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/199e29b9-0d3f-471b-bf0e-de1576f2654a-audit-policies\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899965 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:35 crc kubenswrapper[4831]: I1203 06:34:35.899975 4831 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/199e29b9-0d3f-471b-bf0e-de1576f2654a-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.352746 4831 generic.go:334] "Generic (PLEG): container finished" podID="199e29b9-0d3f-471b-bf0e-de1576f2654a" containerID="8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1" exitCode=0 Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.352796 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" event={"ID":"199e29b9-0d3f-471b-bf0e-de1576f2654a","Type":"ContainerDied","Data":"8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1"} Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.352829 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" event={"ID":"199e29b9-0d3f-471b-bf0e-de1576f2654a","Type":"ContainerDied","Data":"9a6b64a615bdc7044793c96bb824059e3b58b932fe04ad76c41393ae04613031"} Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.352851 4831 scope.go:117] "RemoveContainer" containerID="8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.352969 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-87j4p" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.373975 4831 scope.go:117] "RemoveContainer" containerID="8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1" Dec 03 06:34:36 crc kubenswrapper[4831]: E1203 06:34:36.374410 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1\": container with ID starting with 8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1 not found: ID does not exist" containerID="8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.374442 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1"} err="failed to get container status \"8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1\": rpc error: code = NotFound desc = could not find container \"8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1\": container with ID starting with 8a0f5c352b6ace82ca40a49728816b131cc01388b2f543062e1d71fc94174dc1 not found: ID does not exist" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.389930 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87j4p"] Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.393667 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-87j4p"] Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.635661 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-85455bb588-f8rnq"] Dec 03 06:34:36 crc kubenswrapper[4831]: E1203 06:34:36.636467 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="extract-utilities" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.636568 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="extract-utilities" Dec 03 06:34:36 crc kubenswrapper[4831]: E1203 06:34:36.636688 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="199e29b9-0d3f-471b-bf0e-de1576f2654a" containerName="oauth-openshift" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.636806 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="199e29b9-0d3f-471b-bf0e-de1576f2654a" containerName="oauth-openshift" Dec 03 06:34:36 crc kubenswrapper[4831]: E1203 06:34:36.636905 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="registry-server" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.636982 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="registry-server" Dec 03 06:34:36 crc kubenswrapper[4831]: E1203 06:34:36.637060 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="extract-content" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.637135 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="extract-content" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.637398 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="de96d02d-bef7-48a1-85f6-2d6086ff6498" containerName="registry-server" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.637555 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="199e29b9-0d3f-471b-bf0e-de1576f2654a" containerName="oauth-openshift" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.638116 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.643675 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.643987 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.644069 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.644100 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.644178 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.646991 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85455bb588-f8rnq"] Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.649627 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.649672 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.649672 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.649881 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 06:34:36 crc 
kubenswrapper[4831]: I1203 06:34:36.649933 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.649962 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.650200 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.652417 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.654953 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.662735 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.707929 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4538e325-87ae-4924-9214-5b050cdf4176-audit-dir\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.707977 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-service-ca\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708062 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-session\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708131 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-error\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708167 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlr75\" (UniqueName: \"kubernetes.io/projected/4538e325-87ae-4924-9214-5b050cdf4176-kube-api-access-vlr75\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708269 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708341 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708426 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-login\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-audit-policies\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708510 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.708535 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-router-certs\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.809937 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810051 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-login\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810107 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-audit-policies\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-router-certs\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810379 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4538e325-87ae-4924-9214-5b050cdf4176-audit-dir\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810427 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-service-ca\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810455 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810478 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810502 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-session\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810536 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-error\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vlr75\" (UniqueName: \"kubernetes.io/projected/4538e325-87ae-4924-9214-5b050cdf4176-kube-api-access-vlr75\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810616 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.810640 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.811956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4538e325-87ae-4924-9214-5b050cdf4176-audit-dir\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.812133 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-audit-policies\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.812183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-cliconfig\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.812996 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.813204 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-service-ca\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.814852 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-serving-cert\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.814838 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-router-certs\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.815532 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.816387 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.817960 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-error\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.818780 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-login\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.821824 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.822440 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4538e325-87ae-4924-9214-5b050cdf4176-v4-0-config-system-session\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:36 crc kubenswrapper[4831]: I1203 06:34:36.837556 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlr75\" (UniqueName: \"kubernetes.io/projected/4538e325-87ae-4924-9214-5b050cdf4176-kube-api-access-vlr75\") pod \"oauth-openshift-85455bb588-f8rnq\" (UID: \"4538e325-87ae-4924-9214-5b050cdf4176\") " pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:37 crc kubenswrapper[4831]: I1203 06:34:37.003692 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:37 crc kubenswrapper[4831]: I1203 06:34:37.037259 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="199e29b9-0d3f-471b-bf0e-de1576f2654a" path="/var/lib/kubelet/pods/199e29b9-0d3f-471b-bf0e-de1576f2654a/volumes"
Dec 03 06:34:37 crc kubenswrapper[4831]: I1203 06:34:37.476986 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-85455bb588-f8rnq"]
Dec 03 06:34:38 crc kubenswrapper[4831]: I1203 06:34:38.371259 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq" event={"ID":"4538e325-87ae-4924-9214-5b050cdf4176","Type":"ContainerStarted","Data":"fcd0ec15cfa8939b63f3b3ad27922c68a8b11868f247ca5934d05416c3c326b6"}
Dec 03 06:34:38 crc kubenswrapper[4831]: I1203 06:34:38.372085 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq" event={"ID":"4538e325-87ae-4924-9214-5b050cdf4176","Type":"ContainerStarted","Data":"23f8531d1f993a63e51e84b6efdacc64465c014dccebbb6ce9b946b0c343fcd9"}
Dec 03 06:34:38 crc kubenswrapper[4831]: I1203 06:34:38.372855 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:38 crc kubenswrapper[4831]: I1203 06:34:38.382955 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq"
Dec 03 06:34:38 crc kubenswrapper[4831]: I1203 06:34:38.405933 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-85455bb588-f8rnq" podStartSLOduration=28.405916192 podStartE2EDuration="28.405916192s" podCreationTimestamp="2025-12-03 06:34:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:34:38.40585339 +0000 UTC m=+215.749436938" watchObservedRunningTime="2025-12-03 06:34:38.405916192 +0000 UTC m=+215.749499700"
Dec 03 06:34:45 crc kubenswrapper[4831]: I1203 06:34:45.764591 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59f79b7694-cd2tn"]
Dec 03 06:34:45 crc kubenswrapper[4831]: I1203 06:34:45.765553 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" podUID="7cfdd9a8-5caa-4426-8c88-15340686da2d" containerName="controller-manager" containerID="cri-o://8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2" gracePeriod=30
Dec 03 06:34:45 crc kubenswrapper[4831]: I1203 06:34:45.779120 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz"]
Dec 03 06:34:45 crc kubenswrapper[4831]: I1203 06:34:45.781790 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" podUID="8c616f87-e9b7-492b-addc-73088c201213" containerName="route-controller-manager" containerID="cri-o://3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70" gracePeriod=30
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.218032 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.230228 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254442 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-config\") pod \"7cfdd9a8-5caa-4426-8c88-15340686da2d\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254504 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-proxy-ca-bundles\") pod \"7cfdd9a8-5caa-4426-8c88-15340686da2d\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254571 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dth2d\" (UniqueName: \"kubernetes.io/projected/7cfdd9a8-5caa-4426-8c88-15340686da2d-kube-api-access-dth2d\") pod \"7cfdd9a8-5caa-4426-8c88-15340686da2d\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254598 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-client-ca\") pod \"7cfdd9a8-5caa-4426-8c88-15340686da2d\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254634 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c616f87-e9b7-492b-addc-73088c201213-serving-cert\") pod \"8c616f87-e9b7-492b-addc-73088c201213\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254654 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cfdd9a8-5caa-4426-8c88-15340686da2d-serving-cert\") pod \"7cfdd9a8-5caa-4426-8c88-15340686da2d\" (UID: \"7cfdd9a8-5caa-4426-8c88-15340686da2d\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254682 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-client-ca\") pod \"8c616f87-e9b7-492b-addc-73088c201213\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254708 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-config\") pod \"8c616f87-e9b7-492b-addc-73088c201213\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.254755 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c966n\" (UniqueName: \"kubernetes.io/projected/8c616f87-e9b7-492b-addc-73088c201213-kube-api-access-c966n\") pod \"8c616f87-e9b7-492b-addc-73088c201213\" (UID: \"8c616f87-e9b7-492b-addc-73088c201213\") "
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.257051 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-client-ca" (OuterVolumeSpecName: "client-ca") pod "8c616f87-e9b7-492b-addc-73088c201213" (UID: "8c616f87-e9b7-492b-addc-73088c201213"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.257379 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-config" (OuterVolumeSpecName: "config") pod "7cfdd9a8-5caa-4426-8c88-15340686da2d" (UID: "7cfdd9a8-5caa-4426-8c88-15340686da2d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.257994 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-config" (OuterVolumeSpecName: "config") pod "8c616f87-e9b7-492b-addc-73088c201213" (UID: "8c616f87-e9b7-492b-addc-73088c201213"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.260651 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7cfdd9a8-5caa-4426-8c88-15340686da2d" (UID: "7cfdd9a8-5caa-4426-8c88-15340686da2d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.265038 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cfdd9a8-5caa-4426-8c88-15340686da2d-kube-api-access-dth2d" (OuterVolumeSpecName: "kube-api-access-dth2d") pod "7cfdd9a8-5caa-4426-8c88-15340686da2d" (UID: "7cfdd9a8-5caa-4426-8c88-15340686da2d"). InnerVolumeSpecName "kube-api-access-dth2d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.265051 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cfdd9a8-5caa-4426-8c88-15340686da2d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7cfdd9a8-5caa-4426-8c88-15340686da2d" (UID: "7cfdd9a8-5caa-4426-8c88-15340686da2d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.267061 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-client-ca" (OuterVolumeSpecName: "client-ca") pod "7cfdd9a8-5caa-4426-8c88-15340686da2d" (UID: "7cfdd9a8-5caa-4426-8c88-15340686da2d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.267573 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c616f87-e9b7-492b-addc-73088c201213-kube-api-access-c966n" (OuterVolumeSpecName: "kube-api-access-c966n") pod "8c616f87-e9b7-492b-addc-73088c201213" (UID: "8c616f87-e9b7-492b-addc-73088c201213"). InnerVolumeSpecName "kube-api-access-c966n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.269264 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c616f87-e9b7-492b-addc-73088c201213-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8c616f87-e9b7-492b-addc-73088c201213" (UID: "8c616f87-e9b7-492b-addc-73088c201213"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356707 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dth2d\" (UniqueName: \"kubernetes.io/projected/7cfdd9a8-5caa-4426-8c88-15340686da2d-kube-api-access-dth2d\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356762 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-client-ca\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356772 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8c616f87-e9b7-492b-addc-73088c201213-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356780 4831 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cfdd9a8-5caa-4426-8c88-15340686da2d-serving-cert\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356788 4831 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-client-ca\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356796 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c616f87-e9b7-492b-addc-73088c201213-config\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356806 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c966n\" (UniqueName: \"kubernetes.io/projected/8c616f87-e9b7-492b-addc-73088c201213-kube-api-access-c966n\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356814 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-config\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.356822 4831 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7cfdd9a8-5caa-4426-8c88-15340686da2d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.431537 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.431502 4831 generic.go:334] "Generic (PLEG): container finished" podID="7cfdd9a8-5caa-4426-8c88-15340686da2d" containerID="8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2" exitCode=0
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.431634 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" event={"ID":"7cfdd9a8-5caa-4426-8c88-15340686da2d","Type":"ContainerDied","Data":"8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2"}
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.431671 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-59f79b7694-cd2tn" event={"ID":"7cfdd9a8-5caa-4426-8c88-15340686da2d","Type":"ContainerDied","Data":"ddfa3f4bc6eed394e9097d7bc89d5f3d00c9943b85b2e5830d3237351daf5766"}
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.431697 4831 scope.go:117] "RemoveContainer" containerID="8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.435890 4831 generic.go:334] "Generic (PLEG): container finished" podID="8c616f87-e9b7-492b-addc-73088c201213" containerID="3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70" exitCode=0
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.435933 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" event={"ID":"8c616f87-e9b7-492b-addc-73088c201213","Type":"ContainerDied","Data":"3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70"}
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.436115 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz" event={"ID":"8c616f87-e9b7-492b-addc-73088c201213","Type":"ContainerDied","Data":"5f3edbfea3ccf4112e5acd98ba8c6e0c64a70fd785d1ea074f04211101efa88a"}
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.435979 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.451751 4831 scope.go:117] "RemoveContainer" containerID="8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2"
Dec 03 06:34:46 crc kubenswrapper[4831]: E1203 06:34:46.452197 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2\": container with ID starting with 8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2 not found: ID does not exist" containerID="8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.452227 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2"} err="failed to get container status \"8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2\": rpc error: code = NotFound desc = could not find container \"8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2\": container with ID starting with 8c8bea68a015ba0ce0f725b6763cb48bd5581398b30e8f2458b26c85683242f2 not found: ID does not exist"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.452251 4831 scope.go:117] "RemoveContainer" containerID="3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.464786 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-59f79b7694-cd2tn"]
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.467649 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-59f79b7694-cd2tn"]
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.472935 4831 scope.go:117] "RemoveContainer" containerID="3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70"
Dec 03 06:34:46 crc kubenswrapper[4831]: E1203 06:34:46.474688 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70\": container with ID starting with 3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70 not found: ID does not exist" containerID="3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.474724 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70"} err="failed to get container status \"3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70\": rpc error: code = NotFound desc = could not find container \"3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70\": container with ID starting with 3f2a82e089caadf5b3a3fc2fd1143cc93ce8e3224be6380e78892d5cede27d70 not found: ID does not exist"
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.477080 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz"]
Dec 03 06:34:46 crc kubenswrapper[4831]: I1203 06:34:46.479451 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-568767bfc4-qbswz"]
Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.021960 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cfdd9a8-5caa-4426-8c88-15340686da2d" path="/var/lib/kubelet/pods/7cfdd9a8-5caa-4426-8c88-15340686da2d/volumes"
Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.022976 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c616f87-e9b7-492b-addc-73088c201213" path="/var/lib/kubelet/pods/8c616f87-e9b7-492b-addc-73088c201213/volumes"
Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.643725 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h"]
Dec 03 06:34:47 crc kubenswrapper[4831]: E1203 06:34:47.644002 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c616f87-e9b7-492b-addc-73088c201213" containerName="route-controller-manager"
Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.644019 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c616f87-e9b7-492b-addc-73088c201213" containerName="route-controller-manager"
Dec 03 06:34:47 crc kubenswrapper[4831]: E1203 06:34:47.644042 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfdd9a8-5caa-4426-8c88-15340686da2d" containerName="controller-manager"
Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.644049 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfdd9a8-5caa-4426-8c88-15340686da2d"
containerName="controller-manager" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.644145 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c616f87-e9b7-492b-addc-73088c201213" containerName="route-controller-manager" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.644156 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cfdd9a8-5caa-4426-8c88-15340686da2d" containerName="controller-manager" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.644566 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.647376 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.647834 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.648152 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.648280 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.649058 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-c654897d-lk6vj"] Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.649160 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.650462 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.653824 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.655570 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.660616 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.661267 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.661813 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c654897d-lk6vj"] Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.661965 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.662142 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.662289 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.673784 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc142d4f-f7f6-4e1a-9433-123822d764f7-serving-cert\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " 
pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.673856 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-config\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.673902 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-proxy-ca-bundles\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.674018 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bbf0edd-e940-4d6a-bc85-02ba14211491-serving-cert\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.674082 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpwrk\" (UniqueName: \"kubernetes.io/projected/fc142d4f-f7f6-4e1a-9433-123822d764f7-kube-api-access-bpwrk\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.674160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-client-ca\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.674232 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc142d4f-f7f6-4e1a-9433-123822d764f7-config\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.674291 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gj48z\" (UniqueName: \"kubernetes.io/projected/1bbf0edd-e940-4d6a-bc85-02ba14211491-kube-api-access-gj48z\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.674388 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc142d4f-f7f6-4e1a-9433-123822d764f7-client-ca\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.677816 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h"] Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.680617 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775484 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bbf0edd-e940-4d6a-bc85-02ba14211491-serving-cert\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775581 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpwrk\" (UniqueName: \"kubernetes.io/projected/fc142d4f-f7f6-4e1a-9433-123822d764f7-kube-api-access-bpwrk\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775629 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-client-ca\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775685 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc142d4f-f7f6-4e1a-9433-123822d764f7-config\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775727 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gj48z\" (UniqueName: 
\"kubernetes.io/projected/1bbf0edd-e940-4d6a-bc85-02ba14211491-kube-api-access-gj48z\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775764 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc142d4f-f7f6-4e1a-9433-123822d764f7-client-ca\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775798 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc142d4f-f7f6-4e1a-9433-123822d764f7-serving-cert\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775834 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-config\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.775872 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-proxy-ca-bundles\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.777496 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc142d4f-f7f6-4e1a-9433-123822d764f7-client-ca\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.777912 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-client-ca\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.777956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-config\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.779628 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1bbf0edd-e940-4d6a-bc85-02ba14211491-proxy-ca-bundles\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.779719 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc142d4f-f7f6-4e1a-9433-123822d764f7-config\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 
crc kubenswrapper[4831]: I1203 06:34:47.782104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bbf0edd-e940-4d6a-bc85-02ba14211491-serving-cert\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.782263 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc142d4f-f7f6-4e1a-9433-123822d764f7-serving-cert\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.793632 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gj48z\" (UniqueName: \"kubernetes.io/projected/1bbf0edd-e940-4d6a-bc85-02ba14211491-kube-api-access-gj48z\") pod \"controller-manager-c654897d-lk6vj\" (UID: \"1bbf0edd-e940-4d6a-bc85-02ba14211491\") " pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.803167 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpwrk\" (UniqueName: \"kubernetes.io/projected/fc142d4f-f7f6-4e1a-9433-123822d764f7-kube-api-access-bpwrk\") pod \"route-controller-manager-7c5dcc5474-v766h\" (UID: \"fc142d4f-f7f6-4e1a-9433-123822d764f7\") " pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:47 crc kubenswrapper[4831]: I1203 06:34:47.984358 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.011295 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.262018 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h"] Dec 03 06:34:48 crc kubenswrapper[4831]: W1203 06:34:48.265603 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc142d4f_f7f6_4e1a_9433_123822d764f7.slice/crio-6639c580cfa83cae20078bcf8dba85882f4e55bd93568b3d538dae2dcdd5f0fd WatchSource:0}: Error finding container 6639c580cfa83cae20078bcf8dba85882f4e55bd93568b3d538dae2dcdd5f0fd: Status 404 returned error can't find the container with id 6639c580cfa83cae20078bcf8dba85882f4e55bd93568b3d538dae2dcdd5f0fd Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.312731 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-c654897d-lk6vj"] Dec 03 06:34:48 crc kubenswrapper[4831]: W1203 06:34:48.319892 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bbf0edd_e940_4d6a_bc85_02ba14211491.slice/crio-6be04ea1c501adf3f113947bbb1704a6de8e63777f026e734956e8bdb41e92a2 WatchSource:0}: Error finding container 6be04ea1c501adf3f113947bbb1704a6de8e63777f026e734956e8bdb41e92a2: Status 404 returned error can't find the container with id 6be04ea1c501adf3f113947bbb1704a6de8e63777f026e734956e8bdb41e92a2 Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.451107 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" 
event={"ID":"1bbf0edd-e940-4d6a-bc85-02ba14211491","Type":"ContainerStarted","Data":"d3e87b8d82646ea79303f45ce918a22a5ea8fbf1671a29d87dc3d596d4c40e1d"} Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.451156 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" event={"ID":"1bbf0edd-e940-4d6a-bc85-02ba14211491","Type":"ContainerStarted","Data":"6be04ea1c501adf3f113947bbb1704a6de8e63777f026e734956e8bdb41e92a2"} Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.451247 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.452509 4831 patch_prober.go:28] interesting pod/controller-manager-c654897d-lk6vj container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.452549 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" podUID="1bbf0edd-e940-4d6a-bc85-02ba14211491" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.454205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" event={"ID":"fc142d4f-f7f6-4e1a-9433-123822d764f7","Type":"ContainerStarted","Data":"b7d00327f95ca5a7e2cc3aaeacb196782abe3f3be78467c453c8c657da2b9d46"} Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.454241 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" event={"ID":"fc142d4f-f7f6-4e1a-9433-123822d764f7","Type":"ContainerStarted","Data":"6639c580cfa83cae20078bcf8dba85882f4e55bd93568b3d538dae2dcdd5f0fd"} Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.454453 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.455194 4831 patch_prober.go:28] interesting pod/route-controller-manager-7c5dcc5474-v766h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" start-of-body= Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.455233 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" podUID="fc142d4f-f7f6-4e1a-9433-123822d764f7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": dial tcp 10.217.0.60:8443: connect: connection refused" Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.465637 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" podStartSLOduration=3.4656191659999998 podStartE2EDuration="3.465619166s" podCreationTimestamp="2025-12-03 06:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:34:48.464927164 +0000 UTC m=+225.808510682" watchObservedRunningTime="2025-12-03 06:34:48.465619166 +0000 UTC m=+225.809202674" Dec 03 06:34:48 crc kubenswrapper[4831]: I1203 06:34:48.485368 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" podStartSLOduration=3.485351232 podStartE2EDuration="3.485351232s" podCreationTimestamp="2025-12-03 06:34:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:34:48.483083669 +0000 UTC m=+225.826667167" watchObservedRunningTime="2025-12-03 06:34:48.485351232 +0000 UTC m=+225.828934740" Dec 03 06:34:49 crc kubenswrapper[4831]: I1203 06:34:49.462995 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c5dcc5474-v766h" Dec 03 06:34:49 crc kubenswrapper[4831]: I1203 06:34:49.465193 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-c654897d-lk6vj" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.542436 4831 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.543306 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.543357 4831 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.543673 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801" gracePeriod=15 Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.543813 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364" gracePeriod=15 Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.543875 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f" gracePeriod=15 Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.543870 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1" gracePeriod=15 Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.543944 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-insecure-readyz" containerID="cri-o://3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c" gracePeriod=15 Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.546668 4831 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:34:55 crc kubenswrapper[4831]: E1203 06:34:55.546957 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.546990 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:34:55 crc kubenswrapper[4831]: E1203 06:34:55.547010 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547023 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:34:55 crc kubenswrapper[4831]: E1203 06:34:55.547048 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547059 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:34:55 crc kubenswrapper[4831]: E1203 06:34:55.547075 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547088 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:34:55 crc kubenswrapper[4831]: E1203 06:34:55.547107 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547117 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:34:55 crc kubenswrapper[4831]: E1203 06:34:55.547129 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547140 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:34:55 crc kubenswrapper[4831]: E1203 06:34:55.547156 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547167 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547342 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547368 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547380 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547394 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547410 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.547426 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.621976 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.622471 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.622508 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.622546 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.622585 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.622624 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.622666 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.622702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.628081 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723169 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723249 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723290 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723299 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723337 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723381 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723391 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723435 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 
06:34:55.723418 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723454 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723487 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723512 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723570 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.723537 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: I1203 06:34:55.915463 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:34:55 crc kubenswrapper[4831]: W1203 06:34:55.944870 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-5c7bb210d8d4d0cb1b6f3f1731f3d82f4903f712b26ae71b6282188af13813f9 WatchSource:0}: Error finding container 5c7bb210d8d4d0cb1b6f3f1731f3d82f4903f712b26ae71b6282188af13813f9: Status 404 returned error can't find the container with id 5c7bb210d8d4d0cb1b6f3f1731f3d82f4903f712b26ae71b6282188af13813f9 Dec 03 06:34:55 crc kubenswrapper[4831]: E1203 06:34:55.949541 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187da10a9c9ea652 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:34:55.948490322 +0000 UTC 
m=+233.292073830,LastTimestamp:2025-12-03 06:34:55.948490322 +0000 UTC m=+233.292073830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.497195 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.500052 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.502018 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364" exitCode=0 Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.502176 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c" exitCode=0 Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.502195 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1" exitCode=0 Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.502262 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f" exitCode=2 Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.502260 4831 scope.go:117] "RemoveContainer" containerID="e2675f3f76f19d9bdfb47ec76602f2739e15e0fac0cfee02a6688d54228d0812" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.505510 4831 generic.go:334] "Generic (PLEG): 
container finished" podID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" containerID="709810af3d5ef6ce2ffccdad589fd52439a4e3c137d7a417e881a9c993bbf549" exitCode=0 Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.505595 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"baccf8bb-eb1e-4298-8841-0aaf91b213f6","Type":"ContainerDied","Data":"709810af3d5ef6ce2ffccdad589fd52439a4e3c137d7a417e881a9c993bbf549"} Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.506556 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.507120 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a"} Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.507169 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"5c7bb210d8d4d0cb1b6f3f1731f3d82f4903f712b26ae71b6282188af13813f9"} Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.507112 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.507638 4831 status_manager.go:851] 
"Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.508410 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.508891 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:56 crc kubenswrapper[4831]: I1203 06:34:56.509306 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.008288 4831 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" start-of-body= Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.008357 4831 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="Get \"https://192.168.126.11:6443/readyz\": dial tcp 192.168.126.11:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.517437 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:34:57 crc kubenswrapper[4831]: E1203 06:34:57.897168 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: E1203 06:34:57.898135 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: E1203 06:34:57.898456 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: E1203 06:34:57.898772 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: E1203 06:34:57.899202 4831 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc 
kubenswrapper[4831]: I1203 06:34:57.899357 4831 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Dec 03 06:34:57 crc kubenswrapper[4831]: E1203 06:34:57.899777 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="200ms" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.952117 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.953639 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.954106 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.954685 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.955431 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.956100 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.956677 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.957214 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 
38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.957786 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.966702 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-var-lock\") pod \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.966862 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kubelet-dir\") pod \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.966861 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-var-lock" (OuterVolumeSpecName: "var-lock") pod "baccf8bb-eb1e-4298-8841-0aaf91b213f6" (UID: "baccf8bb-eb1e-4298-8841-0aaf91b213f6"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967001 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kube-api-access\") pod \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\" (UID: \"baccf8bb-eb1e-4298-8841-0aaf91b213f6\") " Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967092 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967184 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967213 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.966904 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "baccf8bb-eb1e-4298-8841-0aaf91b213f6" (UID: "baccf8bb-eb1e-4298-8841-0aaf91b213f6"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967465 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967556 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967419 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967794 4831 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.967993 4831 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.968079 4831 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kubelet-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.968169 4831 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:57 crc kubenswrapper[4831]: I1203 06:34:57.973104 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "baccf8bb-eb1e-4298-8841-0aaf91b213f6" (UID: "baccf8bb-eb1e-4298-8841-0aaf91b213f6"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.070088 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/baccf8bb-eb1e-4298-8841-0aaf91b213f6-kube-api-access\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.071374 4831 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:34:58 crc kubenswrapper[4831]: E1203 06:34:58.100698 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="400ms" Dec 03 06:34:58 crc kubenswrapper[4831]: E1203 06:34:58.501799 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="800ms" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.530967 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.532264 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801" exitCode=0 Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.532447 4831 scope.go:117] "RemoveContainer" containerID="9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.532591 4831 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.536491 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"baccf8bb-eb1e-4298-8841-0aaf91b213f6","Type":"ContainerDied","Data":"ddd82b6040ab359194a6bb32b371cf954cbfe6420235ca5bc36716de987fc41f"} Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.536521 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddd82b6040ab359194a6bb32b371cf954cbfe6420235ca5bc36716de987fc41f" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.536664 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.561607 4831 scope.go:117] "RemoveContainer" containerID="3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.567094 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.567598 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.568114 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.568726 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.569180 4831 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.569666 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.583725 4831 scope.go:117] "RemoveContainer" containerID="c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.605036 4831 scope.go:117] "RemoveContainer" containerID="9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.622864 4831 scope.go:117] "RemoveContainer" containerID="ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801" Dec 03 06:34:58 crc 
kubenswrapper[4831]: I1203 06:34:58.641952 4831 scope.go:117] "RemoveContainer" containerID="368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.663959 4831 scope.go:117] "RemoveContainer" containerID="9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364" Dec 03 06:34:58 crc kubenswrapper[4831]: E1203 06:34:58.664964 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\": container with ID starting with 9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364 not found: ID does not exist" containerID="9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.665011 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364"} err="failed to get container status \"9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\": rpc error: code = NotFound desc = could not find container \"9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364\": container with ID starting with 9fb279b254af2ee87a9ef11c44e77f0cc361cbfcbc8095f85bd9ae0168737364 not found: ID does not exist" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.665110 4831 scope.go:117] "RemoveContainer" containerID="3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c" Dec 03 06:34:58 crc kubenswrapper[4831]: E1203 06:34:58.666674 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\": container with ID starting with 3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c not found: ID does not exist" 
containerID="3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.666713 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c"} err="failed to get container status \"3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\": rpc error: code = NotFound desc = could not find container \"3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c\": container with ID starting with 3a4128ce12ac2a457300d647954078c89de206c0fb618b8cd32920b0a2659e7c not found: ID does not exist" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.666737 4831 scope.go:117] "RemoveContainer" containerID="c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1" Dec 03 06:34:58 crc kubenswrapper[4831]: E1203 06:34:58.667290 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\": container with ID starting with c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1 not found: ID does not exist" containerID="c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.667344 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1"} err="failed to get container status \"c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\": rpc error: code = NotFound desc = could not find container \"c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1\": container with ID starting with c7356a30fd1fb2f17e676e37bf0fcded315d19db80cd9a18a9ab2f83ada3d9c1 not found: ID does not exist" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.667369 4831 scope.go:117] 
"RemoveContainer" containerID="9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f" Dec 03 06:34:58 crc kubenswrapper[4831]: E1203 06:34:58.667714 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\": container with ID starting with 9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f not found: ID does not exist" containerID="9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.667750 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f"} err="failed to get container status \"9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\": rpc error: code = NotFound desc = could not find container \"9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f\": container with ID starting with 9107b787c706987cf0352c6c7de28b1b4c6e3ec9bc23ab40b6dc895b3eab614f not found: ID does not exist" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.667773 4831 scope.go:117] "RemoveContainer" containerID="ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801" Dec 03 06:34:58 crc kubenswrapper[4831]: E1203 06:34:58.668232 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\": container with ID starting with ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801 not found: ID does not exist" containerID="ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.668264 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801"} err="failed to get container status \"ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\": rpc error: code = NotFound desc = could not find container \"ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801\": container with ID starting with ad8029c7a371234401417f006ca9932d50d6e784b4bbefddeb016aa85ac34801 not found: ID does not exist" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.668287 4831 scope.go:117] "RemoveContainer" containerID="368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a" Dec 03 06:34:58 crc kubenswrapper[4831]: E1203 06:34:58.668585 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\": container with ID starting with 368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a not found: ID does not exist" containerID="368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a" Dec 03 06:34:58 crc kubenswrapper[4831]: I1203 06:34:58.668615 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a"} err="failed to get container status \"368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\": rpc error: code = NotFound desc = could not find container \"368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a\": container with ID starting with 368e0270d099d9d855db5642cc00ddcd5e17e7d05c1764f1077d7482a88cd21a not found: ID does not exist" Dec 03 06:34:59 crc kubenswrapper[4831]: E1203 06:34:59.008273 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.234:6443: connect: connection refused" 
event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187da10a9c9ea652 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:34:55.948490322 +0000 UTC m=+233.292073830,LastTimestamp:2025-12-03 06:34:55.948490322 +0000 UTC m=+233.292073830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:34:59 crc kubenswrapper[4831]: I1203 06:34:59.021528 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Dec 03 06:34:59 crc kubenswrapper[4831]: E1203 06:34:59.303269 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="1.6s" Dec 03 06:35:00 crc kubenswrapper[4831]: E1203 06:35:00.904543 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="3.2s" Dec 03 06:35:03 crc kubenswrapper[4831]: I1203 06:35:03.015139 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" 
pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:03 crc kubenswrapper[4831]: I1203 06:35:03.015652 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:04 crc kubenswrapper[4831]: E1203 06:35:04.106497 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="6.4s" Dec 03 06:35:09 crc kubenswrapper[4831]: E1203 06:35:09.011063 4831 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.234:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.187da10a9c9ea652 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2025-12-03 06:34:55.948490322 +0000 UTC m=+233.292073830,LastTimestamp:2025-12-03 
06:34:55.948490322 +0000 UTC m=+233.292073830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.011817 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.016603 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.016929 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.034116 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.034157 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:10 crc kubenswrapper[4831]: E1203 06:35:10.034746 4831 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 
06:35:10.035612 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:10 crc kubenswrapper[4831]: E1203 06:35:10.508171 4831 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.234:6443: connect: connection refused" interval="7s" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.616309 4831 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="7d41b8617b9ddc1eed44615e724b5d9c7798e9977ddd1b61f9e3cb41b7bebbab" exitCode=0 Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.616395 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"7d41b8617b9ddc1eed44615e724b5d9c7798e9977ddd1b61f9e3cb41b7bebbab"} Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.616429 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"cddec454ac3b915ffee0d131d9ae32a8a85066f30fa4b20f43310fd4aadf91b5"} Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.616722 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.616739 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.617162 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" 
err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:10 crc kubenswrapper[4831]: E1203 06:35:10.617184 4831 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.617493 4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.620252 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.620346 4831 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a" exitCode=1 Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.620375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a"} Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.620996 4831 scope.go:117] "RemoveContainer" containerID="9112cc00e66285df5b05b011c501114cab0282e7c348573334f0437e7a61ea3a" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.621221 
4831 status_manager.go:851] "Failed to get status for pod" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.621482 4831 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:10 crc kubenswrapper[4831]: I1203 06:35:10.621805 4831 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.234:6443: connect: connection refused" Dec 03 06:35:11 crc kubenswrapper[4831]: I1203 06:35:11.631198 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"67f974012d87f301dbf000e454bb8df6bfec4b1bc4f93b2c3a7e8c7231aa5cb7"} Dec 03 06:35:11 crc kubenswrapper[4831]: I1203 06:35:11.631475 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f63da3d246ba0a9b6800b6e9c4ac332ffa82582cc7cff3e2f3fda66a56ddade1"} Dec 03 06:35:11 crc kubenswrapper[4831]: I1203 06:35:11.631488 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"a85306bd9a1e743c74ab313ac1cb336dc3b08aee63e8c9c29adf6f5fe80d22a4"} Dec 03 06:35:11 crc kubenswrapper[4831]: I1203 06:35:11.631496 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7009c3a57ce17bcc3e7305ab1f6abc2be327b7f9179c86431048e72c8ab93cb2"} Dec 03 06:35:11 crc kubenswrapper[4831]: I1203 06:35:11.635892 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Dec 03 06:35:11 crc kubenswrapper[4831]: I1203 06:35:11.635925 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a1df7cddd91d77ec7116b53d09bafd4d7985a2558789b5252cf62fd29840f657"} Dec 03 06:35:12 crc kubenswrapper[4831]: I1203 06:35:12.644097 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d34d1dbddcff8f437a6ea452a2b76a2e9c2b5da2eb63be1f64a7370343645910"} Dec 03 06:35:12 crc kubenswrapper[4831]: I1203 06:35:12.644250 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:12 crc kubenswrapper[4831]: I1203 06:35:12.644356 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:12 crc kubenswrapper[4831]: I1203 06:35:12.644379 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:15 crc 
kubenswrapper[4831]: I1203 06:35:15.036652 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:15 crc kubenswrapper[4831]: I1203 06:35:15.037009 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:15 crc kubenswrapper[4831]: I1203 06:35:15.043665 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:17 crc kubenswrapper[4831]: I1203 06:35:17.650744 4831 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:17 crc kubenswrapper[4831]: I1203 06:35:17.671980 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:17 crc kubenswrapper[4831]: I1203 06:35:17.672193 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:17 crc kubenswrapper[4831]: I1203 06:35:17.675946 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:17 crc kubenswrapper[4831]: I1203 06:35:17.677963 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c181e05d-0e80-4e21-8cce-94dfa97c52bc" Dec 03 06:35:18 crc kubenswrapper[4831]: I1203 06:35:18.594047 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:35:18 crc kubenswrapper[4831]: I1203 06:35:18.677590 4831 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:18 crc kubenswrapper[4831]: I1203 06:35:18.677625 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="8801f972-95eb-4de4-ae44-00da9ab048b3" Dec 03 06:35:20 crc kubenswrapper[4831]: I1203 06:35:20.439660 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:35:20 crc kubenswrapper[4831]: I1203 06:35:20.444528 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:35:20 crc kubenswrapper[4831]: I1203 06:35:20.697503 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Dec 03 06:35:23 crc kubenswrapper[4831]: I1203 06:35:23.033196 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c181e05d-0e80-4e21-8cce-94dfa97c52bc" Dec 03 06:35:26 crc kubenswrapper[4831]: I1203 06:35:26.736239 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Dec 03 06:35:26 crc kubenswrapper[4831]: I1203 06:35:26.969255 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Dec 03 06:35:27 crc kubenswrapper[4831]: I1203 06:35:27.109993 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Dec 03 06:35:27 crc kubenswrapper[4831]: I1203 06:35:27.342621 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Dec 03 06:35:27 crc kubenswrapper[4831]: I1203 06:35:27.810229 4831 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Dec 03 06:35:27 crc kubenswrapper[4831]: I1203 06:35:27.832265 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Dec 03 06:35:27 crc kubenswrapper[4831]: I1203 06:35:27.871381 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Dec 03 06:35:28 crc kubenswrapper[4831]: I1203 06:35:28.121729 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Dec 03 06:35:28 crc kubenswrapper[4831]: I1203 06:35:28.122565 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Dec 03 06:35:28 crc kubenswrapper[4831]: I1203 06:35:28.394047 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Dec 03 06:35:28 crc kubenswrapper[4831]: I1203 06:35:28.616944 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Dec 03 06:35:28 crc kubenswrapper[4831]: I1203 06:35:28.875201 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Dec 03 06:35:28 crc kubenswrapper[4831]: I1203 06:35:28.891550 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Dec 03 06:35:29 crc kubenswrapper[4831]: I1203 06:35:29.058540 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Dec 03 06:35:29 crc kubenswrapper[4831]: I1203 06:35:29.452872 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Dec 03 06:35:29 crc kubenswrapper[4831]: I1203 06:35:29.603766 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Dec 03 06:35:29 crc kubenswrapper[4831]: I1203 06:35:29.607823 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Dec 03 06:35:29 crc kubenswrapper[4831]: I1203 06:35:29.811867 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Dec 03 06:35:29 crc kubenswrapper[4831]: I1203 06:35:29.841163 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Dec 03 06:35:29 crc kubenswrapper[4831]: I1203 06:35:29.893907 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Dec 03 06:35:29 crc kubenswrapper[4831]: I1203 06:35:29.938796 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.150966 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.283961 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.326827 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.338481 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Dec 03 06:35:30 crc 
kubenswrapper[4831]: I1203 06:35:30.359306 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.364882 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.407874 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.462355 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.600993 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.613906 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.774246 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.860333 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:35:30 crc kubenswrapper[4831]: I1203 06:35:30.928896 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.084663 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.143715 4831 
reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.152664 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.163678 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.217516 4831 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.233003 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.240165 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.285109 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.313957 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.409732 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.415787 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.435312 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Dec 03 06:35:31 crc 
kubenswrapper[4831]: I1203 06:35:31.511665 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.535256 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.625867 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.633873 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.636779 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.684244 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.750952 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.823780 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.866375 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Dec 03 06:35:31 crc kubenswrapper[4831]: I1203 06:35:31.868133 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.119081 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Dec 03 06:35:32 
crc kubenswrapper[4831]: I1203 06:35:32.132794 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.168419 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.183949 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.330465 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.426883 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.442104 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.583273 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.597586 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.750956 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.756486 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.829650 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-route-controller-manager"/"config" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.864232 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.865951 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.896056 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.907875 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.939056 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Dec 03 06:35:32 crc kubenswrapper[4831]: I1203 06:35:32.958129 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.009363 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.089626 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.092714 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.111878 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.171901 4831 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.218623 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.389466 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.549866 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.554063 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.567851 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.575398 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.622026 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.635283 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.676889 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.681927 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.712911 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.777174 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.862195 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Dec 03 06:35:33 crc kubenswrapper[4831]: I1203 06:35:33.991535 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.020458 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.034491 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.159093 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.191848 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.194165 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.309030 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Dec 03 
06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.813072 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.891069 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.896849 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.898879 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.925223 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.928693 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.959552 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Dec 03 06:35:34 crc kubenswrapper[4831]: I1203 06:35:34.989210 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.102376 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.113139 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.123960 4831 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.197747 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.261975 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.266098 4831 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.312880 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.361793 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.427647 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.546202 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.610080 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.641466 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.813489 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 
06:35:35.819554 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.929164 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.957255 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.992104 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Dec 03 06:35:35 crc kubenswrapper[4831]: I1203 06:35:35.999342 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.007002 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.124107 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.295954 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.320513 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.353304 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.369639 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-console"/"networking-console-plugin" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.396343 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.479777 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.705822 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Dec 03 06:35:36 crc kubenswrapper[4831]: I1203 06:35:36.862129 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.042429 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.055035 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.067296 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.082704 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.092892 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.158255 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.162980 4831 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.250798 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.255463 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.285698 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.469201 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.483498 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.559368 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.661692 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.666596 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.705587 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.712416 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 
06:35:37.713372 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.796789 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.831616 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.835057 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.837987 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.859546 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Dec 03 06:35:37 crc kubenswrapper[4831]: I1203 06:35:37.917581 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.031786 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.249668 4831 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.357139 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.361944 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.622443 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.661968 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.798946 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.868197 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Dec 03 06:35:38 crc kubenswrapper[4831]: I1203 06:35:38.987386 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:38.999986 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.020028 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.069972 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.113711 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.136379 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.176523 4831 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.291899 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.308024 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.359485 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.378701 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.415708 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.438205 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.475091 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.476937 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.503900 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.535021 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.542458 
4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.594071 4831 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.595771 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.598794 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.598766201 podStartE2EDuration="44.598766201s" podCreationTimestamp="2025-12-03 06:34:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:35:17.525213746 +0000 UTC m=+254.868797254" watchObservedRunningTime="2025-12-03 06:35:39.598766201 +0000 UTC m=+276.942349759" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.603411 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.603484 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j22fq","openshift-kube-apiserver/kube-apiserver-crc"] Dec 03 06:35:39 crc kubenswrapper[4831]: E1203 06:35:39.603775 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" containerName="installer" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.603802 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" containerName="installer" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.604054 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="baccf8bb-eb1e-4298-8841-0aaf91b213f6" containerName="installer" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.604782 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rtq2s","openshift-marketplace/marketplace-operator-79b997595-vjhwt","openshift-marketplace/redhat-marketplace-4v6c6","openshift-marketplace/redhat-operators-ndqmw","openshift-marketplace/certified-operators-64gsc"] Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.605111 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-64gsc" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerName="registry-server" containerID="cri-o://cb4727a6296a5b28db8342f8ce962ce1ce7b3856e62cb0313a7e26022c14d138" gracePeriod=30 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.605358 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.605634 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ndqmw" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="registry-server" containerID="cri-o://05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051" gracePeriod=30 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.605917 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rtq2s" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerName="registry-server" containerID="cri-o://532e3ecfca649bcc16e35370a18d6adcedea5cff04eb73d44bf99d88c9a55ac6" gracePeriod=30 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.606227 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-4v6c6" 
podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerName="registry-server" containerID="cri-o://83f85b2da2946a979e73cc350b05e91bf7d357f8aa9bfb30ddcb7ecc07585feb" gracePeriod=30 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.606395 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" podUID="135075ee-2f44-402b-a071-36b3b720d928" containerName="marketplace-operator" containerID="cri-o://7973a7bb0fe3f24d5a7fef9c18eefc12726352545301f6c9c6d16b23bc76b8c8" gracePeriod=30 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.613773 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.630245 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.650508 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.650474489 podStartE2EDuration="22.650474489s" podCreationTimestamp="2025-12-03 06:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:35:39.638690184 +0000 UTC m=+276.982273772" watchObservedRunningTime="2025-12-03 06:35:39.650474489 +0000 UTC m=+276.994058037" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.651307 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.651820 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.675650 4831 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.737419 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c596f0f-729f-4beb-b1f7-58ce65c9a928-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.737439 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.737477 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wndwf\" (UniqueName: \"kubernetes.io/projected/5c596f0f-729f-4beb-b1f7-58ce65c9a928-kube-api-access-wndwf\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.737521 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c596f0f-729f-4beb-b1f7-58ce65c9a928-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.752357 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.790447 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.798186 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.812172 4831 generic.go:334] "Generic (PLEG): container finished" podID="135075ee-2f44-402b-a071-36b3b720d928" containerID="7973a7bb0fe3f24d5a7fef9c18eefc12726352545301f6c9c6d16b23bc76b8c8" exitCode=0 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.812214 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" event={"ID":"135075ee-2f44-402b-a071-36b3b720d928","Type":"ContainerDied","Data":"7973a7bb0fe3f24d5a7fef9c18eefc12726352545301f6c9c6d16b23bc76b8c8"} Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.815530 4831 generic.go:334] "Generic (PLEG): container finished" podID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerID="532e3ecfca649bcc16e35370a18d6adcedea5cff04eb73d44bf99d88c9a55ac6" exitCode=0 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.815574 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtq2s" event={"ID":"ccfbc043-c76e-4afd-a7e7-db0057427fa5","Type":"ContainerDied","Data":"532e3ecfca649bcc16e35370a18d6adcedea5cff04eb73d44bf99d88c9a55ac6"} Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.820939 4831 generic.go:334] "Generic (PLEG): container finished" podID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerID="05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051" exitCode=0 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.820991 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndqmw" 
event={"ID":"ea4d526d-cb74-4ac6-a3fe-33aad14c3444","Type":"ContainerDied","Data":"05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051"} Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.823546 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerID="83f85b2da2946a979e73cc350b05e91bf7d357f8aa9bfb30ddcb7ecc07585feb" exitCode=0 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.824334 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4v6c6" event={"ID":"bf274771-c291-4ab4-9f69-1e1554707a6c","Type":"ContainerDied","Data":"83f85b2da2946a979e73cc350b05e91bf7d357f8aa9bfb30ddcb7ecc07585feb"} Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.834099 4831 generic.go:334] "Generic (PLEG): container finished" podID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerID="cb4727a6296a5b28db8342f8ce962ce1ce7b3856e62cb0313a7e26022c14d138" exitCode=0 Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.835066 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gsc" event={"ID":"5cdecddf-df66-4aef-bc33-65cbcf74db58","Type":"ContainerDied","Data":"cb4727a6296a5b28db8342f8ce962ce1ce7b3856e62cb0313a7e26022c14d138"} Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.839227 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c596f0f-729f-4beb-b1f7-58ce65c9a928-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.839268 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wndwf\" (UniqueName: 
\"kubernetes.io/projected/5c596f0f-729f-4beb-b1f7-58ce65c9a928-kube-api-access-wndwf\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.839297 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c596f0f-729f-4beb-b1f7-58ce65c9a928-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.840561 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5c596f0f-729f-4beb-b1f7-58ce65c9a928-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.843869 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.845995 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5c596f0f-729f-4beb-b1f7-58ce65c9a928-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.854230 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wndwf\" (UniqueName: 
\"kubernetes.io/projected/5c596f0f-729f-4beb-b1f7-58ce65c9a928-kube-api-access-wndwf\") pod \"marketplace-operator-79b997595-j22fq\" (UID: \"5c596f0f-729f-4beb-b1f7-58ce65c9a928\") " pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.858749 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.921974 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Dec 03 06:35:39 crc kubenswrapper[4831]: I1203 06:35:39.945385 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:39 crc kubenswrapper[4831]: E1203 06:35:39.989038 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051 is running failed: container process not found" containerID="05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 06:35:39 crc kubenswrapper[4831]: E1203 06:35:39.989596 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051 is running failed: container process not found" containerID="05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 06:35:39 crc kubenswrapper[4831]: E1203 06:35:39.990074 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051 is 
running failed: container process not found" containerID="05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051" cmd=["grpc_health_probe","-addr=:50051"] Dec 03 06:35:39 crc kubenswrapper[4831]: E1203 06:35:39.990112 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-ndqmw" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="registry-server" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.065958 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.093475 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.130878 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.136855 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.154864 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.158248 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.166768 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.227334 4831 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.227566 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a" gracePeriod=5 Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244435 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-utilities\") pod \"5cdecddf-df66-4aef-bc33-65cbcf74db58\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244474 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrstd\" (UniqueName: \"kubernetes.io/projected/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-kube-api-access-lrstd\") pod \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244528 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/135075ee-2f44-402b-a071-36b3b720d928-marketplace-trusted-ca\") pod \"135075ee-2f44-402b-a071-36b3b720d928\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244558 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-utilities\") pod \"bf274771-c291-4ab4-9f69-1e1554707a6c\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244588 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kvvz\" (UniqueName: \"kubernetes.io/projected/bf274771-c291-4ab4-9f69-1e1554707a6c-kube-api-access-5kvvz\") pod \"bf274771-c291-4ab4-9f69-1e1554707a6c\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244605 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-catalog-content\") pod \"5cdecddf-df66-4aef-bc33-65cbcf74db58\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244645 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-utilities\") pod \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244686 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/135075ee-2f44-402b-a071-36b3b720d928-marketplace-operator-metrics\") pod \"135075ee-2f44-402b-a071-36b3b720d928\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244703 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4npb\" (UniqueName: \"kubernetes.io/projected/5cdecddf-df66-4aef-bc33-65cbcf74db58-kube-api-access-x4npb\") pod 
\"5cdecddf-df66-4aef-bc33-65cbcf74db58\" (UID: \"5cdecddf-df66-4aef-bc33-65cbcf74db58\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244722 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-drgmj\" (UniqueName: \"kubernetes.io/projected/135075ee-2f44-402b-a071-36b3b720d928-kube-api-access-drgmj\") pod \"135075ee-2f44-402b-a071-36b3b720d928\" (UID: \"135075ee-2f44-402b-a071-36b3b720d928\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244737 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-catalog-content\") pod \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\" (UID: \"ea4d526d-cb74-4ac6-a3fe-33aad14c3444\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244754 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-catalog-content\") pod \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244781 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-catalog-content\") pod \"bf274771-c291-4ab4-9f69-1e1554707a6c\" (UID: \"bf274771-c291-4ab4-9f69-1e1554707a6c\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244802 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-utilities\") pod \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.244822 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-x9w5n\" (UniqueName: \"kubernetes.io/projected/ccfbc043-c76e-4afd-a7e7-db0057427fa5-kube-api-access-x9w5n\") pod \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\" (UID: \"ccfbc043-c76e-4afd-a7e7-db0057427fa5\") " Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.245380 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-utilities" (OuterVolumeSpecName: "utilities") pod "5cdecddf-df66-4aef-bc33-65cbcf74db58" (UID: "5cdecddf-df66-4aef-bc33-65cbcf74db58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.245488 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-utilities" (OuterVolumeSpecName: "utilities") pod "bf274771-c291-4ab4-9f69-1e1554707a6c" (UID: "bf274771-c291-4ab4-9f69-1e1554707a6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.245922 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/135075ee-2f44-402b-a071-36b3b720d928-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "135075ee-2f44-402b-a071-36b3b720d928" (UID: "135075ee-2f44-402b-a071-36b3b720d928"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.249001 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cdecddf-df66-4aef-bc33-65cbcf74db58-kube-api-access-x4npb" (OuterVolumeSpecName: "kube-api-access-x4npb") pod "5cdecddf-df66-4aef-bc33-65cbcf74db58" (UID: "5cdecddf-df66-4aef-bc33-65cbcf74db58"). InnerVolumeSpecName "kube-api-access-x4npb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.249067 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccfbc043-c76e-4afd-a7e7-db0057427fa5-kube-api-access-x9w5n" (OuterVolumeSpecName: "kube-api-access-x9w5n") pod "ccfbc043-c76e-4afd-a7e7-db0057427fa5" (UID: "ccfbc043-c76e-4afd-a7e7-db0057427fa5"). InnerVolumeSpecName "kube-api-access-x9w5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.249332 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-kube-api-access-lrstd" (OuterVolumeSpecName: "kube-api-access-lrstd") pod "ea4d526d-cb74-4ac6-a3fe-33aad14c3444" (UID: "ea4d526d-cb74-4ac6-a3fe-33aad14c3444"). InnerVolumeSpecName "kube-api-access-lrstd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.249662 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-utilities" (OuterVolumeSpecName: "utilities") pod "ea4d526d-cb74-4ac6-a3fe-33aad14c3444" (UID: "ea4d526d-cb74-4ac6-a3fe-33aad14c3444"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.250483 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-utilities" (OuterVolumeSpecName: "utilities") pod "ccfbc043-c76e-4afd-a7e7-db0057427fa5" (UID: "ccfbc043-c76e-4afd-a7e7-db0057427fa5"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.256100 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf274771-c291-4ab4-9f69-1e1554707a6c-kube-api-access-5kvvz" (OuterVolumeSpecName: "kube-api-access-5kvvz") pod "bf274771-c291-4ab4-9f69-1e1554707a6c" (UID: "bf274771-c291-4ab4-9f69-1e1554707a6c"). InnerVolumeSpecName "kube-api-access-5kvvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.256370 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/135075ee-2f44-402b-a071-36b3b720d928-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "135075ee-2f44-402b-a071-36b3b720d928" (UID: "135075ee-2f44-402b-a071-36b3b720d928"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.256511 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/135075ee-2f44-402b-a071-36b3b720d928-kube-api-access-drgmj" (OuterVolumeSpecName: "kube-api-access-drgmj") pod "135075ee-2f44-402b-a071-36b3b720d928" (UID: "135075ee-2f44-402b-a071-36b3b720d928"). InnerVolumeSpecName "kube-api-access-drgmj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.273651 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf274771-c291-4ab4-9f69-1e1554707a6c" (UID: "bf274771-c291-4ab4-9f69-1e1554707a6c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.310406 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5cdecddf-df66-4aef-bc33-65cbcf74db58" (UID: "5cdecddf-df66-4aef-bc33-65cbcf74db58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.311361 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ccfbc043-c76e-4afd-a7e7-db0057427fa5" (UID: "ccfbc043-c76e-4afd-a7e7-db0057427fa5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.321424 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.345996 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346030 4831 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/135075ee-2f44-402b-a071-36b3b720d928-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346067 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4npb\" (UniqueName: \"kubernetes.io/projected/5cdecddf-df66-4aef-bc33-65cbcf74db58-kube-api-access-x4npb\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 
06:35:40.346079 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-drgmj\" (UniqueName: \"kubernetes.io/projected/135075ee-2f44-402b-a071-36b3b720d928-kube-api-access-drgmj\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346091 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346101 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346111 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ccfbc043-c76e-4afd-a7e7-db0057427fa5-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346121 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x9w5n\" (UniqueName: \"kubernetes.io/projected/ccfbc043-c76e-4afd-a7e7-db0057427fa5-kube-api-access-x9w5n\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346132 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346141 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrstd\" (UniqueName: \"kubernetes.io/projected/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-kube-api-access-lrstd\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346151 4831 reconciler_common.go:293] "Volume detached for volume 
\"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/135075ee-2f44-402b-a071-36b3b720d928-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346162 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf274771-c291-4ab4-9f69-1e1554707a6c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346171 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5cdecddf-df66-4aef-bc33-65cbcf74db58-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.346181 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kvvz\" (UniqueName: \"kubernetes.io/projected/bf274771-c291-4ab4-9f69-1e1554707a6c-kube-api-access-5kvvz\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.361935 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ea4d526d-cb74-4ac6-a3fe-33aad14c3444" (UID: "ea4d526d-cb74-4ac6-a3fe-33aad14c3444"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.399162 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-j22fq"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.447552 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ea4d526d-cb74-4ac6-a3fe-33aad14c3444-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.472813 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.644504 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.669573 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.683305 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.731147 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.747261 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.826426 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.839907 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" event={"ID":"135075ee-2f44-402b-a071-36b3b720d928","Type":"ContainerDied","Data":"908ecc83c9385b6fe4b439ba14eb36002a432f67121564e193e7212a63b7eb14"} Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.840073 4831 scope.go:117] "RemoveContainer" containerID="7973a7bb0fe3f24d5a7fef9c18eefc12726352545301f6c9c6d16b23bc76b8c8" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.840179 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.839966 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-vjhwt" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.842131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rtq2s" event={"ID":"ccfbc043-c76e-4afd-a7e7-db0057427fa5","Type":"ContainerDied","Data":"80fea317094a3d00023c8a4666edafa49fe939e30c3762fa57878d03865d1d13"} Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.842161 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rtq2s" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.844297 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ndqmw" event={"ID":"ea4d526d-cb74-4ac6-a3fe-33aad14c3444","Type":"ContainerDied","Data":"04123ef09724e760335fa9636da353b69aaecc1a8b321488746f3ab6a1c5eeb3"} Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.844437 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ndqmw" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.853031 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.854470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-4v6c6" event={"ID":"bf274771-c291-4ab4-9f69-1e1554707a6c","Type":"ContainerDied","Data":"31ae8e730854386722f81226b580e143344d454983a84bbccd0b0e840a2274c5"} Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.854646 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-4v6c6" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.858338 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-64gsc" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.858334 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-64gsc" event={"ID":"5cdecddf-df66-4aef-bc33-65cbcf74db58","Type":"ContainerDied","Data":"aafa2a436c94f5a7e53dd1e7b9bc2f352897d17d3254b1c9b6fadf2528ac0a05"} Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.860154 4831 scope.go:117] "RemoveContainer" containerID="532e3ecfca649bcc16e35370a18d6adcedea5cff04eb73d44bf99d88c9a55ac6" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.865041 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" event={"ID":"5c596f0f-729f-4beb-b1f7-58ce65c9a928","Type":"ContainerStarted","Data":"96d77d39ad5a38b7a2437a94a26cded9804adbba2dca0c0c7a79d1b96fea08a0"} Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.865104 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" 
event={"ID":"5c596f0f-729f-4beb-b1f7-58ce65c9a928","Type":"ContainerStarted","Data":"15051dfb585d530565ba2458e4c3d0e02c2f4e04292b518c71a611bcd0b6baf6"} Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.866168 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.881923 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.884638 4831 scope.go:117] "RemoveContainer" containerID="399f0b88f7684a260f5bd513cdca14805cc25c5ff3e40c0639d005383b2b3613" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.885528 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vjhwt"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.894185 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-vjhwt"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.899623 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-4v6c6"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.900435 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.903478 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-4v6c6"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.910092 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" podStartSLOduration=1.9100776499999998 podStartE2EDuration="1.91007765s" podCreationTimestamp="2025-12-03 06:35:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:35:40.905740709 +0000 UTC m=+278.249324237" watchObservedRunningTime="2025-12-03 06:35:40.91007765 +0000 UTC m=+278.253661158" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.917687 4831 scope.go:117] "RemoveContainer" containerID="e190fc9cdacc185ec3c5349d8f84cb6c70e0f6278a5e7ab17e90e5e38642f498" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.935626 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-64gsc"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.939837 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-64gsc"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.947096 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ndqmw"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.954988 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.957233 4831 scope.go:117] "RemoveContainer" containerID="05221bdd08b39cc75cc87540df869ad1463e16d95d607b8b644e68dfb26b1051" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.959852 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ndqmw"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.965242 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rtq2s"] Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.969432 4831 scope.go:117] "RemoveContainer" containerID="62bda3a1a2584028128622934e95acaaf70b654738a7715764ca2a94d4df15db" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.973581 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rtq2s"] Dec 03 06:35:40 crc kubenswrapper[4831]: 
I1203 06:35:40.985722 4831 scope.go:117] "RemoveContainer" containerID="0ad2bd96ad34a1e3ee11191d845b498994363bf825f46d81b32b477282ce9a6e" Dec 03 06:35:40 crc kubenswrapper[4831]: I1203 06:35:40.991990 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.003413 4831 scope.go:117] "RemoveContainer" containerID="83f85b2da2946a979e73cc350b05e91bf7d357f8aa9bfb30ddcb7ecc07585feb" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.019897 4831 scope.go:117] "RemoveContainer" containerID="02d715fcd5f09ec798cc093491d969cb08fae535535a67e096e4c7924b07ae17" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.025696 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="135075ee-2f44-402b-a071-36b3b720d928" path="/var/lib/kubelet/pods/135075ee-2f44-402b-a071-36b3b720d928/volumes" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.026153 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" path="/var/lib/kubelet/pods/5cdecddf-df66-4aef-bc33-65cbcf74db58/volumes" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.026751 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" path="/var/lib/kubelet/pods/bf274771-c291-4ab4-9f69-1e1554707a6c/volumes" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.027754 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" path="/var/lib/kubelet/pods/ccfbc043-c76e-4afd-a7e7-db0057427fa5/volumes" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.028308 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" path="/var/lib/kubelet/pods/ea4d526d-cb74-4ac6-a3fe-33aad14c3444/volumes" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.032245 4831 
scope.go:117] "RemoveContainer" containerID="ddf5a50878aa159057b41d98cf6fdfef6298e80f750ff8dab2f4bd61cddcacf6" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.053366 4831 scope.go:117] "RemoveContainer" containerID="cb4727a6296a5b28db8342f8ce962ce1ce7b3856e62cb0313a7e26022c14d138" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.065086 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.068534 4831 scope.go:117] "RemoveContainer" containerID="0318b2ae97b99bfccd1fbe2b462c9dbee4817059d1caae8743e8c089b294c8ec" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.080647 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.083099 4831 scope.go:117] "RemoveContainer" containerID="f9874f5c209994a6cde1827fe496db55cdadf8f1545eb9eb0ab97a79619d9e34" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.097856 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.107557 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.138927 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.168710 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.182675 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Dec 03 06:35:41 crc 
kubenswrapper[4831]: I1203 06:35:41.306527 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.365256 4831 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.387227 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.424509 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.426385 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.543440 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.551416 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.575042 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.653238 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.690137 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.694657 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Dec 03 
06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.847978 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.903805 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Dec 03 06:35:41 crc kubenswrapper[4831]: I1203 06:35:41.937137 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Dec 03 06:35:42 crc kubenswrapper[4831]: I1203 06:35:42.182888 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Dec 03 06:35:42 crc kubenswrapper[4831]: I1203 06:35:42.360242 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Dec 03 06:35:42 crc kubenswrapper[4831]: I1203 06:35:42.579734 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Dec 03 06:35:42 crc kubenswrapper[4831]: I1203 06:35:42.631724 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Dec 03 06:35:42 crc kubenswrapper[4831]: I1203 06:35:42.937002 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Dec 03 06:35:42 crc kubenswrapper[4831]: I1203 06:35:42.947512 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Dec 03 06:35:43 crc kubenswrapper[4831]: I1203 06:35:43.292302 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Dec 03 06:35:45 crc kubenswrapper[4831]: I1203 06:35:43.337579 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Dec 03 06:35:45 crc kubenswrapper[4831]: I1203 06:35:43.442481 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Dec 03 06:35:45 crc kubenswrapper[4831]: I1203 06:35:43.537934 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Dec 03 06:35:45 crc kubenswrapper[4831]: I1203 06:35:43.793188 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Dec 03 06:35:45 crc kubenswrapper[4831]: I1203 06:35:43.828296 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Dec 03 06:35:45 crc kubenswrapper[4831]: I1203 06:35:43.843500 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Dec 03 06:35:45 crc kubenswrapper[4831]: I1203 06:35:45.010646 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Dec 03 06:35:45 crc kubenswrapper[4831]: I1203 06:35:45.668780 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.232758 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.233211 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.338633 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.338681 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.338742 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.338765 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.338805 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.339061 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: 
"var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.339097 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.339120 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.339140 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.346819 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.440477 4831 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.440529 4831 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.440551 4831 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.440568 4831 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.440588 4831 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.668428 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.668512 4831 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a" exitCode=137 Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.668584 4831 scope.go:117] "RemoveContainer" 
containerID="042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.668661 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.691737 4831 scope.go:117] "RemoveContainer" containerID="042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a" Dec 03 06:35:46 crc kubenswrapper[4831]: E1203 06:35:46.692411 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a\": container with ID starting with 042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a not found: ID does not exist" containerID="042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a" Dec 03 06:35:46 crc kubenswrapper[4831]: I1203 06:35:46.692454 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a"} err="failed to get container status \"042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a\": rpc error: code = NotFound desc = could not find container \"042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a\": container with ID starting with 042a92012f45e0a4383ab39957576846acaa32cc79b599dff8b98dbcb298e33a not found: ID does not exist" Dec 03 06:35:47 crc kubenswrapper[4831]: I1203 06:35:47.020852 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Dec 03 06:35:47 crc kubenswrapper[4831]: I1203 06:35:47.021074 4831 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Dec 03 06:35:47 crc kubenswrapper[4831]: 
I1203 06:35:47.032347 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:35:47 crc kubenswrapper[4831]: I1203 06:35:47.032389 4831 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="956e12d6-3816-4af2-b6c5-8977626edf7a" Dec 03 06:35:47 crc kubenswrapper[4831]: I1203 06:35:47.036631 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Dec 03 06:35:47 crc kubenswrapper[4831]: I1203 06:35:47.036656 4831 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="956e12d6-3816-4af2-b6c5-8977626edf7a" Dec 03 06:36:14 crc kubenswrapper[4831]: I1203 06:36:14.918132 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572376 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xdz7k"] Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572648 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="extract-content" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572666 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="extract-content" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572681 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerName="extract-content" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572691 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerName="extract-content" Dec 03 06:36:15 crc 
kubenswrapper[4831]: E1203 06:36:15.572706 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerName="registry-server" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572716 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerName="registry-server" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572737 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerName="registry-server" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572747 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerName="registry-server" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572758 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572768 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572782 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerName="extract-utilities" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572792 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerName="extract-utilities" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572803 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="extract-utilities" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572815 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="extract-utilities" Dec 03 06:36:15 crc 
kubenswrapper[4831]: E1203 06:36:15.572825 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="135075ee-2f44-402b-a071-36b3b720d928" containerName="marketplace-operator" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572835 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="135075ee-2f44-402b-a071-36b3b720d928" containerName="marketplace-operator" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572849 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerName="extract-content" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572860 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerName="extract-content" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572877 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerName="extract-utilities" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572887 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerName="extract-utilities" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572900 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="registry-server" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572910 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="registry-server" Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572920 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerName="extract-utilities" Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572931 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerName="extract-utilities" Dec 03 06:36:15 
crc kubenswrapper[4831]: E1203 06:36:15.572947 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerName="registry-server"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572957 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerName="registry-server"
Dec 03 06:36:15 crc kubenswrapper[4831]: E1203 06:36:15.572972 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerName="extract-content"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.572982 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerName="extract-content"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.573106 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.573123 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="135075ee-2f44-402b-a071-36b3b720d928" containerName="marketplace-operator"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.573138 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cdecddf-df66-4aef-bc33-65cbcf74db58" containerName="registry-server"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.573154 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf274771-c291-4ab4-9f69-1e1554707a6c" containerName="registry-server"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.573175 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4d526d-cb74-4ac6-a3fe-33aad14c3444" containerName="registry-server"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.573189 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccfbc043-c76e-4afd-a7e7-db0057427fa5" containerName="registry-server"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.574214 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.577071 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.586351 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xdz7k"]
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.625055 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-utilities\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.625123 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-catalog-content\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.625256 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6wpn\" (UniqueName: \"kubernetes.io/projected/3947b643-3e94-4f5b-81d2-b2132bb654f8-kube-api-access-v6wpn\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.726813 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6wpn\" (UniqueName: \"kubernetes.io/projected/3947b643-3e94-4f5b-81d2-b2132bb654f8-kube-api-access-v6wpn\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.726880 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-utilities\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.726927 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-catalog-content\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.727344 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-catalog-content\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.727763 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-utilities\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.745810 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6wpn\" (UniqueName: \"kubernetes.io/projected/3947b643-3e94-4f5b-81d2-b2132bb654f8-kube-api-access-v6wpn\") pod \"community-operators-xdz7k\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.759914 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-9mjxh"]
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.761060 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.762805 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.774530 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9mjxh"]
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.827757 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-utilities\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.827869 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hztkx\" (UniqueName: \"kubernetes.io/projected/03e08a40-1b90-40fb-a497-72589cdb0dcc-kube-api-access-hztkx\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.827907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-catalog-content\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.928958 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-catalog-content\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.929023 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-utilities\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.929063 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hztkx\" (UniqueName: \"kubernetes.io/projected/03e08a40-1b90-40fb-a497-72589cdb0dcc-kube-api-access-hztkx\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.929598 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-catalog-content\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.929682 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-utilities\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.941036 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:15 crc kubenswrapper[4831]: I1203 06:36:15.948623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hztkx\" (UniqueName: \"kubernetes.io/projected/03e08a40-1b90-40fb-a497-72589cdb0dcc-kube-api-access-hztkx\") pod \"certified-operators-9mjxh\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.096763 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.152124 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xdz7k"]
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.510061 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-9mjxh"]
Dec 03 06:36:16 crc kubenswrapper[4831]: W1203 06:36:16.514563 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03e08a40_1b90_40fb_a497_72589cdb0dcc.slice/crio-1f0860c25afe3a5fefa5d395091777f72dd6e55cc12904c7479fb95756a1b98a WatchSource:0}: Error finding container 1f0860c25afe3a5fefa5d395091777f72dd6e55cc12904c7479fb95756a1b98a: Status 404 returned error can't find the container with id 1f0860c25afe3a5fefa5d395091777f72dd6e55cc12904c7479fb95756a1b98a
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.837482 4831 generic.go:334] "Generic (PLEG): container finished" podID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerID="20bb4a09eabde14a8879b6ae0982718167bf0eab33fded11b875033dabd64ebf" exitCode=0
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.837630 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mjxh" event={"ID":"03e08a40-1b90-40fb-a497-72589cdb0dcc","Type":"ContainerDied","Data":"20bb4a09eabde14a8879b6ae0982718167bf0eab33fded11b875033dabd64ebf"}
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.837841 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mjxh" event={"ID":"03e08a40-1b90-40fb-a497-72589cdb0dcc","Type":"ContainerStarted","Data":"1f0860c25afe3a5fefa5d395091777f72dd6e55cc12904c7479fb95756a1b98a"}
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.843949 4831 generic.go:334] "Generic (PLEG): container finished" podID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerID="d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb" exitCode=0
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.843994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz7k" event={"ID":"3947b643-3e94-4f5b-81d2-b2132bb654f8","Type":"ContainerDied","Data":"d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb"}
Dec 03 06:36:16 crc kubenswrapper[4831]: I1203 06:36:16.844041 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz7k" event={"ID":"3947b643-3e94-4f5b-81d2-b2132bb654f8","Type":"ContainerStarted","Data":"b79afb3b88ece017c9ee646e826fc66d355e200077bdc50b5eb9f707aa27989e"}
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.360002 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-njc9k"]
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.361638 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.364407 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.389948 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-njc9k"]
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.456733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl6rr\" (UniqueName: \"kubernetes.io/projected/4139c162-640d-47e5-878f-b6c3835bd31d-kube-api-access-fl6rr\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.456851 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4139c162-640d-47e5-878f-b6c3835bd31d-catalog-content\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.456888 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4139c162-640d-47e5-878f-b6c3835bd31d-utilities\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.559057 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl6rr\" (UniqueName: \"kubernetes.io/projected/4139c162-640d-47e5-878f-b6c3835bd31d-kube-api-access-fl6rr\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.559143 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4139c162-640d-47e5-878f-b6c3835bd31d-catalog-content\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.559167 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4139c162-640d-47e5-878f-b6c3835bd31d-utilities\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.559607 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4139c162-640d-47e5-878f-b6c3835bd31d-utilities\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.560363 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4139c162-640d-47e5-878f-b6c3835bd31d-catalog-content\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.582343 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl6rr\" (UniqueName: \"kubernetes.io/projected/4139c162-640d-47e5-878f-b6c3835bd31d-kube-api-access-fl6rr\") pod \"redhat-marketplace-njc9k\" (UID: \"4139c162-640d-47e5-878f-b6c3835bd31d\") " pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.684433 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.855196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mjxh" event={"ID":"03e08a40-1b90-40fb-a497-72589cdb0dcc","Type":"ContainerStarted","Data":"639d13b1fb5d4e11c6ad425d8bd45f1ac36dd2d40829542d7c0a9dbeb3365f72"}
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.862902 4831 generic.go:334] "Generic (PLEG): container finished" podID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerID="314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2" exitCode=0
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.862938 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz7k" event={"ID":"3947b643-3e94-4f5b-81d2-b2132bb654f8","Type":"ContainerDied","Data":"314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2"}
Dec 03 06:36:17 crc kubenswrapper[4831]: I1203 06:36:17.899339 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-njc9k"]
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.377173 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vshtj"]
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.379266 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.388440 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.400078 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vshtj"]
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.478169 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-utilities\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.478291 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-catalog-content\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.478399 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2j59\" (UniqueName: \"kubernetes.io/projected/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-kube-api-access-d2j59\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.580288 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2j59\" (UniqueName: \"kubernetes.io/projected/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-kube-api-access-d2j59\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.580499 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-utilities\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.580588 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-catalog-content\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.581426 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-utilities\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.581560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-catalog-content\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.621512 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2j59\" (UniqueName: \"kubernetes.io/projected/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-kube-api-access-d2j59\") pod \"redhat-operators-vshtj\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.737174 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.869804 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz7k" event={"ID":"3947b643-3e94-4f5b-81d2-b2132bb654f8","Type":"ContainerStarted","Data":"e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688"}
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.870942 4831 generic.go:334] "Generic (PLEG): container finished" podID="4139c162-640d-47e5-878f-b6c3835bd31d" containerID="9f1e7e37591243a1ec2383a68bb99ad6bf05dde153c86d37ef24a2c0c63b7be4" exitCode=0
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.871000 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njc9k" event={"ID":"4139c162-640d-47e5-878f-b6c3835bd31d","Type":"ContainerDied","Data":"9f1e7e37591243a1ec2383a68bb99ad6bf05dde153c86d37ef24a2c0c63b7be4"}
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.871017 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njc9k" event={"ID":"4139c162-640d-47e5-878f-b6c3835bd31d","Type":"ContainerStarted","Data":"1c08b9067fc5d45c31c8eeaced874a70561e0e0fca871f264efb0b9a3891d10e"}
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.880625 4831 generic.go:334] "Generic (PLEG): container finished" podID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerID="639d13b1fb5d4e11c6ad425d8bd45f1ac36dd2d40829542d7c0a9dbeb3365f72" exitCode=0
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.880687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mjxh" event={"ID":"03e08a40-1b90-40fb-a497-72589cdb0dcc","Type":"ContainerDied","Data":"639d13b1fb5d4e11c6ad425d8bd45f1ac36dd2d40829542d7c0a9dbeb3365f72"}
Dec 03 06:36:18 crc kubenswrapper[4831]: I1203 06:36:18.903438 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xdz7k" podStartSLOduration=2.419267277 podStartE2EDuration="3.903416742s" podCreationTimestamp="2025-12-03 06:36:15 +0000 UTC" firstStartedPulling="2025-12-03 06:36:16.845959567 +0000 UTC m=+314.189543075" lastFinishedPulling="2025-12-03 06:36:18.330109002 +0000 UTC m=+315.673692540" observedRunningTime="2025-12-03 06:36:18.886005344 +0000 UTC m=+316.229588852" watchObservedRunningTime="2025-12-03 06:36:18.903416742 +0000 UTC m=+316.247000250"
Dec 03 06:36:19 crc kubenswrapper[4831]: I1203 06:36:19.170480 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vshtj"]
Dec 03 06:36:19 crc kubenswrapper[4831]: W1203 06:36:19.178622 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod89c1dc30_6b2e_4dbd_916d_0c6fd5907a71.slice/crio-33d389d2583fa36b98c2a5804a3956852ea846a7f412b3e3f3f7826bacced236 WatchSource:0}: Error finding container 33d389d2583fa36b98c2a5804a3956852ea846a7f412b3e3f3f7826bacced236: Status 404 returned error can't find the container with id 33d389d2583fa36b98c2a5804a3956852ea846a7f412b3e3f3f7826bacced236
Dec 03 06:36:19 crc kubenswrapper[4831]: I1203 06:36:19.888042 4831 generic.go:334] "Generic (PLEG): container finished" podID="4139c162-640d-47e5-878f-b6c3835bd31d" containerID="1b82ffe37f6f685b26c8de7fa3b6221c119f442225a7681fa9d5555756f3e0a9" exitCode=0
Dec 03 06:36:19 crc kubenswrapper[4831]: I1203 06:36:19.888197 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njc9k" event={"ID":"4139c162-640d-47e5-878f-b6c3835bd31d","Type":"ContainerDied","Data":"1b82ffe37f6f685b26c8de7fa3b6221c119f442225a7681fa9d5555756f3e0a9"}
Dec 03 06:36:19 crc kubenswrapper[4831]: I1203 06:36:19.890817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mjxh" event={"ID":"03e08a40-1b90-40fb-a497-72589cdb0dcc","Type":"ContainerStarted","Data":"23eec0c855d888ba42fae44af3741ef3e321cf53ade8e352e3b32fd77a01a3be"}
Dec 03 06:36:19 crc kubenswrapper[4831]: I1203 06:36:19.892560 4831 generic.go:334] "Generic (PLEG): container finished" podID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerID="197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b" exitCode=0
Dec 03 06:36:19 crc kubenswrapper[4831]: I1203 06:36:19.892684 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vshtj" event={"ID":"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71","Type":"ContainerDied","Data":"197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b"}
Dec 03 06:36:19 crc kubenswrapper[4831]: I1203 06:36:19.892804 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vshtj" event={"ID":"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71","Type":"ContainerStarted","Data":"33d389d2583fa36b98c2a5804a3956852ea846a7f412b3e3f3f7826bacced236"}
Dec 03 06:36:19 crc kubenswrapper[4831]: I1203 06:36:19.945087 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-9mjxh" podStartSLOduration=2.4533349700000002 podStartE2EDuration="4.945069907s" podCreationTimestamp="2025-12-03 06:36:15 +0000 UTC" firstStartedPulling="2025-12-03 06:36:16.841298765 +0000 UTC m=+314.184882313" lastFinishedPulling="2025-12-03 06:36:19.333033742 +0000 UTC m=+316.676617250" observedRunningTime="2025-12-03 06:36:19.944058034 +0000 UTC m=+317.287641552" watchObservedRunningTime="2025-12-03 06:36:19.945069907 +0000 UTC m=+317.288653415"
Dec 03 06:36:20 crc kubenswrapper[4831]: I1203 06:36:20.900006 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-njc9k" event={"ID":"4139c162-640d-47e5-878f-b6c3835bd31d","Type":"ContainerStarted","Data":"5d632e82db700675387a871ba5611842fa5ee2dce9ca550008e5517cfddeb283"}
Dec 03 06:36:20 crc kubenswrapper[4831]: I1203 06:36:20.901866 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vshtj" event={"ID":"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71","Type":"ContainerStarted","Data":"f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976"}
Dec 03 06:36:20 crc kubenswrapper[4831]: I1203 06:36:20.919934 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-njc9k" podStartSLOduration=2.468635708 podStartE2EDuration="3.91991867s" podCreationTimestamp="2025-12-03 06:36:17 +0000 UTC" firstStartedPulling="2025-12-03 06:36:18.872878155 +0000 UTC m=+316.216461663" lastFinishedPulling="2025-12-03 06:36:20.324161117 +0000 UTC m=+317.667744625" observedRunningTime="2025-12-03 06:36:20.915600339 +0000 UTC m=+318.259183887" watchObservedRunningTime="2025-12-03 06:36:20.91991867 +0000 UTC m=+318.263502188"
Dec 03 06:36:21 crc kubenswrapper[4831]: I1203 06:36:21.912665 4831 generic.go:334] "Generic (PLEG): container finished" podID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerID="f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976" exitCode=0
Dec 03 06:36:21 crc kubenswrapper[4831]: I1203 06:36:21.912789 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vshtj" event={"ID":"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71","Type":"ContainerDied","Data":"f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976"}
Dec 03 06:36:22 crc kubenswrapper[4831]: I1203 06:36:22.275840 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Dec 03 06:36:22 crc kubenswrapper[4831]: I1203 06:36:22.920215 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vshtj" event={"ID":"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71","Type":"ContainerStarted","Data":"bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e"}
Dec 03 06:36:22 crc kubenswrapper[4831]: I1203 06:36:22.943302 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vshtj" podStartSLOduration=2.489473094 podStartE2EDuration="4.943279692s" podCreationTimestamp="2025-12-03 06:36:18 +0000 UTC" firstStartedPulling="2025-12-03 06:36:19.893791912 +0000 UTC m=+317.237375430" lastFinishedPulling="2025-12-03 06:36:22.34759851 +0000 UTC m=+319.691182028" observedRunningTime="2025-12-03 06:36:22.939178968 +0000 UTC m=+320.282762506" watchObservedRunningTime="2025-12-03 06:36:22.943279692 +0000 UTC m=+320.286863210"
Dec 03 06:36:25 crc kubenswrapper[4831]: I1203 06:36:25.942283 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:25 crc kubenswrapper[4831]: I1203 06:36:25.942657 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:26 crc kubenswrapper[4831]: I1203 06:36:26.003106 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:26 crc kubenswrapper[4831]: I1203 06:36:26.097883 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:26 crc kubenswrapper[4831]: I1203 06:36:26.097954 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:26 crc kubenswrapper[4831]: I1203 06:36:26.131836 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:26 crc kubenswrapper[4831]: I1203 06:36:26.980723 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xdz7k"
Dec 03 06:36:26 crc kubenswrapper[4831]: I1203 06:36:26.991467 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-9mjxh"
Dec 03 06:36:27 crc kubenswrapper[4831]: I1203 06:36:27.684660 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:27 crc kubenswrapper[4831]: I1203 06:36:27.684715 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:27 crc kubenswrapper[4831]: I1203 06:36:27.731683 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:27 crc kubenswrapper[4831]: I1203 06:36:27.996877 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-njc9k"
Dec 03 06:36:28 crc kubenswrapper[4831]: I1203 06:36:28.737536 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:28 crc kubenswrapper[4831]: I1203 06:36:28.737590 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:28 crc kubenswrapper[4831]: I1203 06:36:28.807211 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:29 crc kubenswrapper[4831]: I1203 06:36:29.024356 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vshtj"
Dec 03 06:36:57 crc kubenswrapper[4831]: I1203 06:36:57.596398 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 06:36:57 crc kubenswrapper[4831]: I1203 06:36:57.597148 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.492863 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mgxm2"]
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.495997 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.518150 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mgxm2"]
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.678556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v798t\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-kube-api-access-v798t\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.678681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-registry-tls\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.678726 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/46fc1fd0-5320-4272-8ba5-3c958ef8b948-registry-certificates\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.678871 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.678943 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fc1fd0-5320-4272-8ba5-3c958ef8b948-trusted-ca\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.679019 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-bound-sa-token\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.679054 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/46fc1fd0-5320-4272-8ba5-3c958ef8b948-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.679098 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/46fc1fd0-5320-4272-8ba5-3c958ef8b948-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2"
Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.703342 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.780518 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/46fc1fd0-5320-4272-8ba5-3c958ef8b948-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.780812 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/46fc1fd0-5320-4272-8ba5-3c958ef8b948-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.780866 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v798t\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-kube-api-access-v798t\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.780890 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-registry-tls\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.780907 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/46fc1fd0-5320-4272-8ba5-3c958ef8b948-registry-certificates\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.780935 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fc1fd0-5320-4272-8ba5-3c958ef8b948-trusted-ca\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.780963 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-bound-sa-token\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.782095 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/46fc1fd0-5320-4272-8ba5-3c958ef8b948-ca-trust-extracted\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.782210 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/46fc1fd0-5320-4272-8ba5-3c958ef8b948-registry-certificates\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 
03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.782987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/46fc1fd0-5320-4272-8ba5-3c958ef8b948-trusted-ca\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.790967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-registry-tls\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.791978 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/46fc1fd0-5320-4272-8ba5-3c958ef8b948-installation-pull-secrets\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.805159 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-bound-sa-token\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.819619 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v798t\" (UniqueName: \"kubernetes.io/projected/46fc1fd0-5320-4272-8ba5-3c958ef8b948-kube-api-access-v798t\") pod \"image-registry-66df7c8f76-mgxm2\" (UID: \"46fc1fd0-5320-4272-8ba5-3c958ef8b948\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:09 crc kubenswrapper[4831]: I1203 06:37:09.825226 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:10 crc kubenswrapper[4831]: I1203 06:37:10.050989 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-mgxm2"] Dec 03 06:37:10 crc kubenswrapper[4831]: I1203 06:37:10.575014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" event={"ID":"46fc1fd0-5320-4272-8ba5-3c958ef8b948","Type":"ContainerStarted","Data":"bd2bd8d3a414dbf17b9bfc20a23bba138240f98fe896e204311212a86d257635"} Dec 03 06:37:10 crc kubenswrapper[4831]: I1203 06:37:10.575075 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" event={"ID":"46fc1fd0-5320-4272-8ba5-3c958ef8b948","Type":"ContainerStarted","Data":"4ca90e1126fe20fede6b6d3a2fb546e135b3440a1373e8c50d7e2a1e0332a291"} Dec 03 06:37:10 crc kubenswrapper[4831]: I1203 06:37:10.575274 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:10 crc kubenswrapper[4831]: I1203 06:37:10.602605 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" podStartSLOduration=1.602588217 podStartE2EDuration="1.602588217s" podCreationTimestamp="2025-12-03 06:37:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:37:10.601195302 +0000 UTC m=+367.944778820" watchObservedRunningTime="2025-12-03 06:37:10.602588217 +0000 UTC m=+367.946171735" Dec 03 06:37:27 crc kubenswrapper[4831]: I1203 06:37:27.596586 4831 patch_prober.go:28] interesting 
pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:37:27 crc kubenswrapper[4831]: I1203 06:37:27.597008 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:37:29 crc kubenswrapper[4831]: I1203 06:37:29.838178 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-mgxm2" Dec 03 06:37:29 crc kubenswrapper[4831]: I1203 06:37:29.904184 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncw4v"] Dec 03 06:37:54 crc kubenswrapper[4831]: I1203 06:37:54.952902 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" podUID="1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" containerName="registry" containerID="cri-o://aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080" gracePeriod=30 Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.389106 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.489913 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.489971 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-trusted-ca\") pod \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.490037 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9stz\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-kube-api-access-l9stz\") pod \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.490060 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-tls\") pod \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.490084 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-certificates\") pod \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.490145 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-installation-pull-secrets\") pod \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.490171 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-ca-trust-extracted\") pod \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.490198 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-bound-sa-token\") pod \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\" (UID: \"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4\") " Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.491181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.491172 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.493196 4831 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-certificates\") on node \"crc\" DevicePath \"\"" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.493244 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-trusted-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.497004 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.498503 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.503532 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.503983 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-kube-api-access-l9stz" (OuterVolumeSpecName: "kube-api-access-l9stz") pod "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4"). InnerVolumeSpecName "kube-api-access-l9stz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.505403 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.510989 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" (UID: "1d9d5be4-e63c-4f4a-85c0-ef3735894eb4"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.594611 4831 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.594656 4831 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.594667 4831 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-bound-sa-token\") on node \"crc\" DevicePath \"\"" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.594677 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l9stz\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-kube-api-access-l9stz\") on node \"crc\" DevicePath \"\"" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.594689 4831 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4-registry-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.874413 4831 generic.go:334] "Generic (PLEG): container finished" podID="1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" containerID="aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080" exitCode=0 Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.874473 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" 
event={"ID":"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4","Type":"ContainerDied","Data":"aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080"} Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.874518 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.874524 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-ncw4v" event={"ID":"1d9d5be4-e63c-4f4a-85c0-ef3735894eb4","Type":"ContainerDied","Data":"0dd3fa2e4c40f947b90235108f947a2e5fd174dc1790c383be63326f6073d8eb"} Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.874539 4831 scope.go:117] "RemoveContainer" containerID="aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.903424 4831 scope.go:117] "RemoveContainer" containerID="aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080" Dec 03 06:37:55 crc kubenswrapper[4831]: E1203 06:37:55.904476 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080\": container with ID starting with aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080 not found: ID does not exist" containerID="aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.904561 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080"} err="failed to get container status \"aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080\": rpc error: code = NotFound desc = could not find container \"aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080\": container with ID 
starting with aa3ad225e2bfdcf38c973c450f6478afd52dd5d0d7942642e504aada16033080 not found: ID does not exist" Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.945262 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncw4v"] Dec 03 06:37:55 crc kubenswrapper[4831]: I1203 06:37:55.952271 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-ncw4v"] Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.023408 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" path="/var/lib/kubelet/pods/1d9d5be4-e63c-4f4a-85c0-ef3735894eb4/volumes" Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.596816 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.596921 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.597002 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.598843 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b518ced4b22d22ab3d00838b322beb819fadfbd73a5137b1811cce469ae983d6"} 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.599275 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://b518ced4b22d22ab3d00838b322beb819fadfbd73a5137b1811cce469ae983d6" gracePeriod=600 Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.891006 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="b518ced4b22d22ab3d00838b322beb819fadfbd73a5137b1811cce469ae983d6" exitCode=0 Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.891048 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"b518ced4b22d22ab3d00838b322beb819fadfbd73a5137b1811cce469ae983d6"} Dec 03 06:37:57 crc kubenswrapper[4831]: I1203 06:37:57.891081 4831 scope.go:117] "RemoveContainer" containerID="d5f4fb897c02a83eeed7021028e23efc127e1a5f3043f0f2f6e5af657458891c" Dec 03 06:37:58 crc kubenswrapper[4831]: I1203 06:37:58.899966 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"20ad819c7678e6daebd9a1a40a71eabae034828538ab794dbea604a758d7449c"} Dec 03 06:39:57 crc kubenswrapper[4831]: I1203 06:39:57.596820 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Dec 03 06:39:57 crc kubenswrapper[4831]: I1203 06:39:57.599360 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:40:27 crc kubenswrapper[4831]: I1203 06:40:27.597212 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:40:27 crc kubenswrapper[4831]: I1203 06:40:27.597959 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:40:57 crc kubenswrapper[4831]: I1203 06:40:57.597008 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:40:57 crc kubenswrapper[4831]: I1203 06:40:57.597645 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:40:57 crc kubenswrapper[4831]: I1203 06:40:57.597706 4831 kubelet.go:2542] "SyncLoop 
(probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:40:57 crc kubenswrapper[4831]: I1203 06:40:57.598443 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"20ad819c7678e6daebd9a1a40a71eabae034828538ab794dbea604a758d7449c"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:40:57 crc kubenswrapper[4831]: I1203 06:40:57.598556 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://20ad819c7678e6daebd9a1a40a71eabae034828538ab794dbea604a758d7449c" gracePeriod=600 Dec 03 06:40:58 crc kubenswrapper[4831]: I1203 06:40:58.066824 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="20ad819c7678e6daebd9a1a40a71eabae034828538ab794dbea604a758d7449c" exitCode=0 Dec 03 06:40:58 crc kubenswrapper[4831]: I1203 06:40:58.066862 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"20ad819c7678e6daebd9a1a40a71eabae034828538ab794dbea604a758d7449c"} Dec 03 06:40:58 crc kubenswrapper[4831]: I1203 06:40:58.067235 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"591c53f4c8c9620b5b60eed4f0d2632e242390fceb4ae25a90151135f08319c6"} Dec 03 06:40:58 crc kubenswrapper[4831]: I1203 06:40:58.067262 4831 scope.go:117] "RemoveContainer" 
containerID="b518ced4b22d22ab3d00838b322beb819fadfbd73a5137b1811cce469ae983d6" Dec 03 06:42:57 crc kubenswrapper[4831]: I1203 06:42:57.596848 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:42:57 crc kubenswrapper[4831]: I1203 06:42:57.597527 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:43:27 crc kubenswrapper[4831]: I1203 06:43:27.596254 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:43:27 crc kubenswrapper[4831]: I1203 06:43:27.596917 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:43:32 crc kubenswrapper[4831]: I1203 06:43:32.048029 4831 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.587848 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ps95j"] Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.588814 
4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovn-controller" containerID="cri-o://60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae" gracePeriod=30 Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.588906 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="sbdb" containerID="cri-o://a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f" gracePeriod=30 Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.588983 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="nbdb" containerID="cri-o://e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d" gracePeriod=30 Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.588956 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e" gracePeriod=30 Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.588972 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kube-rbac-proxy-node" containerID="cri-o://179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14" gracePeriod=30 Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.588930 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" 
podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="northd" containerID="cri-o://a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0" gracePeriod=30 Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.588986 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovn-acl-logging" containerID="cri-o://a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9" gracePeriod=30 Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.634766 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" containerID="cri-o://44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" gracePeriod=30 Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.929550 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/3.log" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.932170 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovn-acl-logging/0.log" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.932684 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovn-controller/0.log" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.933379 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975021 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-var-lib-openvswitch\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975178 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-slash\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975103 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975249 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-var-lib-cni-networks-ovn-kubernetes\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975301 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-slash" (OuterVolumeSpecName: "host-slash") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). 
InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975358 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d7d0c92-6857-4846-93ab-3364282a1e85-ovn-node-metrics-cert\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975390 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-config\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-bin\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975405 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975434 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-log-socket\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975462 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-systemd-units\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975485 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-systemd\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975516 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-etc-openvswitch\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975541 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-env-overrides\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-ovn-kubernetes\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975592 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-node-log\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975638 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-ovn\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975659 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-kubelet\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975681 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sntzl\" (UniqueName: \"kubernetes.io/projected/3d7d0c92-6857-4846-93ab-3364282a1e85-kube-api-access-sntzl\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975700 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-netns\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 
06:43:39.975721 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-netd\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975753 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-script-lib\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975771 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-openvswitch\") pod \"3d7d0c92-6857-4846-93ab-3364282a1e85\" (UID: \"3d7d0c92-6857-4846-93ab-3364282a1e85\") " Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975992 4831 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.975994 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976006 4831 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-slash\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976021 4831 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976051 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976077 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976102 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-log-socket" (OuterVolumeSpecName: "log-socket") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976124 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976689 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976729 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976744 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976696 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.976977 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.977016 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.977098 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.977046 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-node-log" (OuterVolumeSpecName: "node-log") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.983186 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.984058 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3d7d0c92-6857-4846-93ab-3364282a1e85-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.984599 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d7d0c92-6857-4846-93ab-3364282a1e85-kube-api-access-sntzl" (OuterVolumeSpecName: "kube-api-access-sntzl") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "kube-api-access-sntzl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990264 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-lgj4z"] Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990513 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990534 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990547 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990556 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990569 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="northd" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990578 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="northd" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990587 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990596 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990605 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" 
containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990613 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990629 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990638 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990650 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="nbdb" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990658 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="nbdb" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990670 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kubecfg-setup" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990678 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kubecfg-setup" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990693 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" containerName="registry" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990701 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" containerName="registry" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990713 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" 
containerName="ovn-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990722 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovn-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990733 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovn-acl-logging" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990741 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovn-acl-logging" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990753 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kube-rbac-proxy-node" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990761 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kube-rbac-proxy-node" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.990773 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="sbdb" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990781 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="sbdb" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990899 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990911 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="sbdb" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990926 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" 
containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990935 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovn-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990946 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovn-acl-logging" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990955 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kube-rbac-proxy-node" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990966 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="kube-rbac-proxy-ovn-metrics" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990978 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d9d5be4-e63c-4f4a-85c0-ef3735894eb4" containerName="registry" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990989 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.990997 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.991007 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="northd" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.991018 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="nbdb" Dec 03 06:43:39 crc kubenswrapper[4831]: E1203 06:43:39.991131 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.991140 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.991269 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerName="ovnkube-controller" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.992184 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "3d7d0c92-6857-4846-93ab-3364282a1e85" (UID: "3d7d0c92-6857-4846-93ab-3364282a1e85"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:43:39 crc kubenswrapper[4831]: I1203 06:43:39.993281 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.076823 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077116 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/181008eb-05b9-49cb-a868-1d91372a75e6-ovn-node-metrics-cert\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077243 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-etc-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077405 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-log-socket\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077505 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-slash\") pod 
\"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077587 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-var-lib-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-node-log\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077845 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-cni-netd\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.077920 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-ovnkube-script-lib\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078000 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-ovn\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078118 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-kubelet\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078152 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-systemd-units\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078178 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-env-overrides\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078200 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-systemd\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078417 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-cni-bin\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078498 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-run-netns\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078537 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078566 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-ovnkube-config\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078665 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jj9b\" (UniqueName: \"kubernetes.io/projected/181008eb-05b9-49cb-a868-1d91372a75e6-kube-api-access-4jj9b\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.078984 4831 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-node-log\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079085 4831 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079166 4831 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-kubelet\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079242 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sntzl\" (UniqueName: \"kubernetes.io/projected/3d7d0c92-6857-4846-93ab-3364282a1e85-kube-api-access-sntzl\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079364 4831 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-netns\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079456 4831 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-netd\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079530 4831 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079612 4831 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079687 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3d7d0c92-6857-4846-93ab-3364282a1e85-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079768 4831 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-ovnkube-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079842 4831 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-cni-bin\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.079919 4831 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-log-socket\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.080000 4831 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-systemd-units\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.080075 4831 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-run-systemd\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.080154 4831 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.080249 4831 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3d7d0c92-6857-4846-93ab-3364282a1e85-env-overrides\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.080338 4831 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3d7d0c92-6857-4846-93ab-3364282a1e85-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.178342 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovnkube-controller/3.log" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181251 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-var-lib-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 
06:43:40.181344 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-node-log\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181362 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-cni-netd\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181382 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-ovnkube-script-lib\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181401 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-ovn\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181419 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-kubelet\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181435 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-systemd-units\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181474 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-env-overrides\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181494 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-systemd\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181518 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-cni-bin\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-run-netns\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181540 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-cni-netd\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181557 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181606 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181643 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-ovnkube-config\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181767 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jj9b\" (UniqueName: \"kubernetes.io/projected/181008eb-05b9-49cb-a868-1d91372a75e6-kube-api-access-4jj9b\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181833 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovn-acl-logging/0.log" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181885 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181927 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/181008eb-05b9-49cb-a868-1d91372a75e6-ovn-node-metrics-cert\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181952 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-ovn\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.181984 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-etc-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182031 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-systemd\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182046 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-log-socket\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182072 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-slash\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182073 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-cni-bin\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182198 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-kubelet\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182198 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-systemd-units\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182218 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-run-netns\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182259 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-run-ovn-kubernetes\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182477 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-log-socket\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182595 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-etc-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182697 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-var-lib-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182733 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-run-openvswitch\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182761 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-node-log\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.182839 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/181008eb-05b9-49cb-a868-1d91372a75e6-host-slash\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183091 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-ps95j_3d7d0c92-6857-4846-93ab-3364282a1e85/ovn-controller/0.log" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183170 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-env-overrides\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183491 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-ovnkube-script-lib\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 
06:43:40.183520 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/181008eb-05b9-49cb-a868-1d91372a75e6-ovnkube-config\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183691 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" exitCode=0 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183735 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f" exitCode=0 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183754 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d" exitCode=0 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183771 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0" exitCode=0 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183789 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e" exitCode=0 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183803 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14" exitCode=0 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183816 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9" exitCode=143 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183830 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d7d0c92-6857-4846-93ab-3364282a1e85" containerID="60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae" exitCode=143 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183940 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.183983 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184005 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184022 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184047 4831 scope.go:117] "RemoveContainer" containerID="44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184028 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184342 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184366 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184386 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184404 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184416 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 
06:43:40.184428 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184442 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184453 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184464 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184477 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184488 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184518 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184530 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184540 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184551 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184561 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184571 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184581 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184628 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184639 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" 
containerID={"Type":"cri-o","ID":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184650 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184665 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184681 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184694 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184705 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184716 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184727 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184738 4831 
pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184749 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184759 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184770 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184780 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184794 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-ps95j" event={"ID":"3d7d0c92-6857-4846-93ab-3364282a1e85","Type":"ContainerDied","Data":"18f819e0cd64ee0bce1037ff1266ef042a5491ae047c2f407f4e4fb3d863dea3"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184809 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184824 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} Dec 03 
06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184834 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184845 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184855 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184866 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184876 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184887 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184902 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.184913 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} Dec 03 
06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.186927 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/2.log" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.188156 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/1.log" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.188218 4831 generic.go:334] "Generic (PLEG): container finished" podID="74a16df4-1f25-4b0f-9e08-f6486f262a68" containerID="cccd1d19fe7a46c7f1bfe0299b4666ece8cecce74a0354d94cf9edcb4d647bd5" exitCode=2 Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.188250 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vz8ft" event={"ID":"74a16df4-1f25-4b0f-9e08-f6486f262a68","Type":"ContainerDied","Data":"cccd1d19fe7a46c7f1bfe0299b4666ece8cecce74a0354d94cf9edcb4d647bd5"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.188274 4831 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4"} Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.188972 4831 scope.go:117] "RemoveContainer" containerID="cccd1d19fe7a46c7f1bfe0299b4666ece8cecce74a0354d94cf9edcb4d647bd5" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.194428 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/181008eb-05b9-49cb-a868-1d91372a75e6-ovn-node-metrics-cert\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.206853 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jj9b\" (UniqueName: 
\"kubernetes.io/projected/181008eb-05b9-49cb-a868-1d91372a75e6-kube-api-access-4jj9b\") pod \"ovnkube-node-lgj4z\" (UID: \"181008eb-05b9-49cb-a868-1d91372a75e6\") " pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.214049 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.240459 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ps95j"] Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.242712 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-ps95j"] Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.254470 4831 scope.go:117] "RemoveContainer" containerID="a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.273672 4831 scope.go:117] "RemoveContainer" containerID="e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.291745 4831 scope.go:117] "RemoveContainer" containerID="a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.309981 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.314046 4831 scope.go:117] "RemoveContainer" containerID="59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.337537 4831 scope.go:117] "RemoveContainer" containerID="179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14" Dec 03 06:43:40 crc kubenswrapper[4831]: W1203 06:43:40.347600 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod181008eb_05b9_49cb_a868_1d91372a75e6.slice/crio-4c4401f3e5b6c88eb72d643e8da6f690497c644abed8bfa4930cabb8bc4aaf9d WatchSource:0}: Error finding container 4c4401f3e5b6c88eb72d643e8da6f690497c644abed8bfa4930cabb8bc4aaf9d: Status 404 returned error can't find the container with id 4c4401f3e5b6c88eb72d643e8da6f690497c644abed8bfa4930cabb8bc4aaf9d Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.361959 4831 scope.go:117] "RemoveContainer" containerID="a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.384032 4831 scope.go:117] "RemoveContainer" containerID="60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.400308 4831 scope.go:117] "RemoveContainer" containerID="f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.418862 4831 scope.go:117] "RemoveContainer" containerID="44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.419194 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": container with ID starting with 
44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c not found: ID does not exist" containerID="44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.419336 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} err="failed to get container status \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": rpc error: code = NotFound desc = could not find container \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": container with ID starting with 44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.419434 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.419766 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": container with ID starting with b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2 not found: ID does not exist" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.419857 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} err="failed to get container status \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": rpc error: code = NotFound desc = could not find container \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": container with ID starting with b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2 not found: ID does not 
exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.419955 4831 scope.go:117] "RemoveContainer" containerID="a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.420305 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": container with ID starting with a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f not found: ID does not exist" containerID="a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.420421 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} err="failed to get container status \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": rpc error: code = NotFound desc = could not find container \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": container with ID starting with a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.420506 4831 scope.go:117] "RemoveContainer" containerID="e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.420788 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": container with ID starting with e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d not found: ID does not exist" containerID="e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.420924 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} err="failed to get container status \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": rpc error: code = NotFound desc = could not find container \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": container with ID starting with e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.421013 4831 scope.go:117] "RemoveContainer" containerID="a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.421305 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": container with ID starting with a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0 not found: ID does not exist" containerID="a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.421421 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} err="failed to get container status \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": rpc error: code = NotFound desc = could not find container \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": container with ID starting with a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.421512 4831 scope.go:117] "RemoveContainer" containerID="59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.421793 4831 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": container with ID starting with 59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e not found: ID does not exist" containerID="59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.421895 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} err="failed to get container status \"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": rpc error: code = NotFound desc = could not find container \"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": container with ID starting with 59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.421977 4831 scope.go:117] "RemoveContainer" containerID="179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.422259 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": container with ID starting with 179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14 not found: ID does not exist" containerID="179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.422372 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} err="failed to get container status \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": rpc error: code = NotFound desc = could 
not find container \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": container with ID starting with 179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.422466 4831 scope.go:117] "RemoveContainer" containerID="a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.422914 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": container with ID starting with a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9 not found: ID does not exist" containerID="a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.423028 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} err="failed to get container status \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": rpc error: code = NotFound desc = could not find container \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": container with ID starting with a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.423145 4831 scope.go:117] "RemoveContainer" containerID="60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.423520 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": container with ID starting with 60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae not found: 
ID does not exist" containerID="60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.423622 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} err="failed to get container status \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": rpc error: code = NotFound desc = could not find container \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": container with ID starting with 60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.423711 4831 scope.go:117] "RemoveContainer" containerID="f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797" Dec 03 06:43:40 crc kubenswrapper[4831]: E1203 06:43:40.423965 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": container with ID starting with f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797 not found: ID does not exist" containerID="f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.424071 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} err="failed to get container status \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": rpc error: code = NotFound desc = could not find container \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": container with ID starting with f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.424152 4831 
scope.go:117] "RemoveContainer" containerID="44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.424619 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} err="failed to get container status \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": rpc error: code = NotFound desc = could not find container \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": container with ID starting with 44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.424768 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.425119 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} err="failed to get container status \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": rpc error: code = NotFound desc = could not find container \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": container with ID starting with b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.425233 4831 scope.go:117] "RemoveContainer" containerID="a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.425540 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} err="failed to get container status \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": rpc 
error: code = NotFound desc = could not find container \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": container with ID starting with a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.425559 4831 scope.go:117] "RemoveContainer" containerID="e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.425818 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} err="failed to get container status \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": rpc error: code = NotFound desc = could not find container \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": container with ID starting with e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.425908 4831 scope.go:117] "RemoveContainer" containerID="a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.426220 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} err="failed to get container status \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": rpc error: code = NotFound desc = could not find container \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": container with ID starting with a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.426329 4831 scope.go:117] "RemoveContainer" containerID="59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e" Dec 03 06:43:40 crc 
kubenswrapper[4831]: I1203 06:43:40.426615 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} err="failed to get container status \"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": rpc error: code = NotFound desc = could not find container \"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": container with ID starting with 59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.426630 4831 scope.go:117] "RemoveContainer" containerID="179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.426811 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} err="failed to get container status \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": rpc error: code = NotFound desc = could not find container \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": container with ID starting with 179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.426900 4831 scope.go:117] "RemoveContainer" containerID="a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.427243 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} err="failed to get container status \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": rpc error: code = NotFound desc = could not find container \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": container 
with ID starting with a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.427261 4831 scope.go:117] "RemoveContainer" containerID="60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.427462 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} err="failed to get container status \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": rpc error: code = NotFound desc = could not find container \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": container with ID starting with 60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.427573 4831 scope.go:117] "RemoveContainer" containerID="f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.427863 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} err="failed to get container status \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": rpc error: code = NotFound desc = could not find container \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": container with ID starting with f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.427879 4831 scope.go:117] "RemoveContainer" containerID="44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.428060 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} err="failed to get container status \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": rpc error: code = NotFound desc = could not find container \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": container with ID starting with 44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.428200 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.428582 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} err="failed to get container status \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": rpc error: code = NotFound desc = could not find container \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": container with ID starting with b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.428601 4831 scope.go:117] "RemoveContainer" containerID="a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.428882 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} err="failed to get container status \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": rpc error: code = NotFound desc = could not find container \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": container with ID starting with a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f not found: ID does not 
exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.429008 4831 scope.go:117] "RemoveContainer" containerID="e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.429430 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} err="failed to get container status \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": rpc error: code = NotFound desc = could not find container \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": container with ID starting with e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.429537 4831 scope.go:117] "RemoveContainer" containerID="a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.429790 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} err="failed to get container status \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": rpc error: code = NotFound desc = could not find container \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": container with ID starting with a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.429807 4831 scope.go:117] "RemoveContainer" containerID="59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.430645 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} err="failed to get container status 
\"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": rpc error: code = NotFound desc = could not find container \"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": container with ID starting with 59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.430661 4831 scope.go:117] "RemoveContainer" containerID="179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.430847 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} err="failed to get container status \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": rpc error: code = NotFound desc = could not find container \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": container with ID starting with 179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.430954 4831 scope.go:117] "RemoveContainer" containerID="a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.431352 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} err="failed to get container status \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": rpc error: code = NotFound desc = could not find container \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": container with ID starting with a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.431454 4831 scope.go:117] "RemoveContainer" 
containerID="60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.431817 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} err="failed to get container status \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": rpc error: code = NotFound desc = could not find container \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": container with ID starting with 60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.432008 4831 scope.go:117] "RemoveContainer" containerID="f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.432294 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} err="failed to get container status \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": rpc error: code = NotFound desc = could not find container \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": container with ID starting with f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.432407 4831 scope.go:117] "RemoveContainer" containerID="44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.433065 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} err="failed to get container status \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": rpc error: code = NotFound desc = could 
not find container \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": container with ID starting with 44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.433082 4831 scope.go:117] "RemoveContainer" containerID="b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.433268 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2"} err="failed to get container status \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": rpc error: code = NotFound desc = could not find container \"b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2\": container with ID starting with b5612fb364a6497ebfe5585d8a64bbff2fc8e840e225d3ce83d0ef855669f0f2 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.433433 4831 scope.go:117] "RemoveContainer" containerID="a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.433834 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f"} err="failed to get container status \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": rpc error: code = NotFound desc = could not find container \"a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f\": container with ID starting with a9ca1771a8b3623b2133de23529c9738bfa77d2ddc1cc34cc52c781dc2c3f93f not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.433925 4831 scope.go:117] "RemoveContainer" containerID="e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 
06:43:40.434245 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d"} err="failed to get container status \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": rpc error: code = NotFound desc = could not find container \"e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d\": container with ID starting with e291cd32a68cba081bd6543fba92ceaacc240c2855f9563c6709c1cca1c14f8d not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.434459 4831 scope.go:117] "RemoveContainer" containerID="a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.434844 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0"} err="failed to get container status \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": rpc error: code = NotFound desc = could not find container \"a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0\": container with ID starting with a1721f47da8dba8d4f6d4b7e99e681f1ac702b200711965cd64cd7d1ba2053f0 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.434941 4831 scope.go:117] "RemoveContainer" containerID="59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.435296 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e"} err="failed to get container status \"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": rpc error: code = NotFound desc = could not find container \"59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e\": container with ID starting with 
59c7a7d2c987e7e4d3b6513b9c2938934503bde399482241857ad9b46337703e not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.435409 4831 scope.go:117] "RemoveContainer" containerID="179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.435736 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14"} err="failed to get container status \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": rpc error: code = NotFound desc = could not find container \"179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14\": container with ID starting with 179199e076c0983172305abb4aeb681ac0c3b40617acf736e75b28f28e51ae14 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.435786 4831 scope.go:117] "RemoveContainer" containerID="a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.436048 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9"} err="failed to get container status \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": rpc error: code = NotFound desc = could not find container \"a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9\": container with ID starting with a3eb72fd22e0c2a859af8df2e7e2164514b6ae14242edeb786905eec55ab14f9 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.436135 4831 scope.go:117] "RemoveContainer" containerID="60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.436475 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae"} err="failed to get container status \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": rpc error: code = NotFound desc = could not find container \"60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae\": container with ID starting with 60608d5b4aadcda72078291846927d25faa976703aec4af4ba3247fc7ed397ae not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.436617 4831 scope.go:117] "RemoveContainer" containerID="f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.436974 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797"} err="failed to get container status \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": rpc error: code = NotFound desc = could not find container \"f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797\": container with ID starting with f568e41d44e69c74e987c2589a9139e61577a756eaadc926003d5492768d5797 not found: ID does not exist" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.437070 4831 scope.go:117] "RemoveContainer" containerID="44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c" Dec 03 06:43:40 crc kubenswrapper[4831]: I1203 06:43:40.437334 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c"} err="failed to get container status \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": rpc error: code = NotFound desc = could not find container \"44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c\": container with ID starting with 44fb0cb80fbc9d3c91709bec76785a3a37833c5a0fc0c98ef01cd4e5e3597e7c not found: ID does not 
exist" Dec 03 06:43:41 crc kubenswrapper[4831]: I1203 06:43:41.019502 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3d7d0c92-6857-4846-93ab-3364282a1e85" path="/var/lib/kubelet/pods/3d7d0c92-6857-4846-93ab-3364282a1e85/volumes" Dec 03 06:43:41 crc kubenswrapper[4831]: I1203 06:43:41.193727 4831 generic.go:334] "Generic (PLEG): container finished" podID="181008eb-05b9-49cb-a868-1d91372a75e6" containerID="db685e0775928747b93abf1b214cd6056b55324249b6fc4047e8e78d6b309032" exitCode=0 Dec 03 06:43:41 crc kubenswrapper[4831]: I1203 06:43:41.193794 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerDied","Data":"db685e0775928747b93abf1b214cd6056b55324249b6fc4047e8e78d6b309032"} Dec 03 06:43:41 crc kubenswrapper[4831]: I1203 06:43:41.193824 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"4c4401f3e5b6c88eb72d643e8da6f690497c644abed8bfa4930cabb8bc4aaf9d"} Dec 03 06:43:41 crc kubenswrapper[4831]: I1203 06:43:41.198089 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/2.log" Dec 03 06:43:41 crc kubenswrapper[4831]: I1203 06:43:41.198459 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/1.log" Dec 03 06:43:41 crc kubenswrapper[4831]: I1203 06:43:41.198547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vz8ft" event={"ID":"74a16df4-1f25-4b0f-9e08-f6486f262a68","Type":"ContainerStarted","Data":"ddd2c8bf4fb4137086d71bb15f971a0e83a2f7b8bcc9bc8dbbf9797b5919c47c"} Dec 03 06:43:42 crc kubenswrapper[4831]: I1203 06:43:42.209792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"0085c7bc71516f4f529ac1743444e8e31626ce27ed09345c3d676d750d715367"} Dec 03 06:43:42 crc kubenswrapper[4831]: I1203 06:43:42.210353 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"afa8e2ce019d226376496a0baafe88b4d270c6e74c9407dfae4ed355452bf57a"} Dec 03 06:43:42 crc kubenswrapper[4831]: I1203 06:43:42.210381 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"c40f3fe63b47469162220c1c4f9aeee952d1fc62ea358b1b33f8cbacb4131667"} Dec 03 06:43:42 crc kubenswrapper[4831]: I1203 06:43:42.210395 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"01b3d68b7428c993dc376e574f1d5b00ced70be114d036350a16148b94a68541"} Dec 03 06:43:42 crc kubenswrapper[4831]: I1203 06:43:42.210407 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"350d275227f5c7e710455b202c0085b1da190ff988ae73735085f9ce0f61a623"} Dec 03 06:43:42 crc kubenswrapper[4831]: I1203 06:43:42.210418 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"4840d7401a385cd6b2950eef85b171a9790d6bb89c8ac10c5ee9b28006d31659"} Dec 03 06:43:45 crc kubenswrapper[4831]: I1203 06:43:45.258341 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" 
event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"06c5669780e7fd489d1369372767f9e3399fa017c86317e88bc97552ea114df3"} Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.278055 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" event={"ID":"181008eb-05b9-49cb-a868-1d91372a75e6","Type":"ContainerStarted","Data":"183976fa57fc496d90e1951505a4fd5fe42a5c79293d9c789ed5b85460a4596b"} Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.960147 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-p2xx8"] Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.960735 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.963349 4831 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6mcp8" Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.964616 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.964795 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.964851 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.974272 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p2xx8"] Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.983191 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a25efc27-9981-4e58-bf27-8e9650464fc4-crc-storage\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " 
pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.983232 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbhqr\" (UniqueName: \"kubernetes.io/projected/a25efc27-9981-4e58-bf27-8e9650464fc4-kube-api-access-zbhqr\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:47 crc kubenswrapper[4831]: I1203 06:43:47.983259 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a25efc27-9981-4e58-bf27-8e9650464fc4-node-mnt\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.084909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a25efc27-9981-4e58-bf27-8e9650464fc4-crc-storage\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.084968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbhqr\" (UniqueName: \"kubernetes.io/projected/a25efc27-9981-4e58-bf27-8e9650464fc4-kube-api-access-zbhqr\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.085002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a25efc27-9981-4e58-bf27-8e9650464fc4-node-mnt\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc 
kubenswrapper[4831]: I1203 06:43:48.085273 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a25efc27-9981-4e58-bf27-8e9650464fc4-node-mnt\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.086499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a25efc27-9981-4e58-bf27-8e9650464fc4-crc-storage\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.113202 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbhqr\" (UniqueName: \"kubernetes.io/projected/a25efc27-9981-4e58-bf27-8e9650464fc4-kube-api-access-zbhqr\") pod \"crc-storage-crc-p2xx8\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.283029 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.291143 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.291182 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:48 crc kubenswrapper[4831]: E1203 06:43:48.322266 4831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p2xx8_crc-storage_a25efc27-9981-4e58-bf27-8e9650464fc4_0(e1bedcead93c1d491c369db75ea0b0b995e1a2ad1240c94820d676d349e264da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:43:48 crc kubenswrapper[4831]: E1203 06:43:48.322363 4831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p2xx8_crc-storage_a25efc27-9981-4e58-bf27-8e9650464fc4_0(e1bedcead93c1d491c369db75ea0b0b995e1a2ad1240c94820d676d349e264da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: E1203 06:43:48.322386 4831 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p2xx8_crc-storage_a25efc27-9981-4e58-bf27-8e9650464fc4_0(e1bedcead93c1d491c369db75ea0b0b995e1a2ad1240c94820d676d349e264da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:48 crc kubenswrapper[4831]: E1203 06:43:48.322436 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-p2xx8_crc-storage(a25efc27-9981-4e58-bf27-8e9650464fc4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-p2xx8_crc-storage(a25efc27-9981-4e58-bf27-8e9650464fc4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p2xx8_crc-storage_a25efc27-9981-4e58-bf27-8e9650464fc4_0(e1bedcead93c1d491c369db75ea0b0b995e1a2ad1240c94820d676d349e264da): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-p2xx8" podUID="a25efc27-9981-4e58-bf27-8e9650464fc4" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.329518 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" podStartSLOduration=9.329497272 podStartE2EDuration="9.329497272s" podCreationTimestamp="2025-12-03 06:43:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:43:48.324154542 +0000 UTC m=+765.667738050" watchObservedRunningTime="2025-12-03 06:43:48.329497272 +0000 UTC m=+765.673080780" Dec 03 06:43:48 crc kubenswrapper[4831]: I1203 06:43:48.341673 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:49 crc kubenswrapper[4831]: I1203 06:43:49.296675 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:49 crc kubenswrapper[4831]: I1203 06:43:49.297522 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:49 crc kubenswrapper[4831]: I1203 06:43:49.297629 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:49 crc kubenswrapper[4831]: E1203 06:43:49.341270 4831 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p2xx8_crc-storage_a25efc27-9981-4e58-bf27-8e9650464fc4_0(05c4629e9cecbe9b1426d32a7d106b169be31cd0c30e1a98b73c7fb4cdf65533): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Dec 03 06:43:49 crc kubenswrapper[4831]: E1203 06:43:49.341425 4831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p2xx8_crc-storage_a25efc27-9981-4e58-bf27-8e9650464fc4_0(05c4629e9cecbe9b1426d32a7d106b169be31cd0c30e1a98b73c7fb4cdf65533): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:49 crc kubenswrapper[4831]: E1203 06:43:49.341475 4831 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p2xx8_crc-storage_a25efc27-9981-4e58-bf27-8e9650464fc4_0(05c4629e9cecbe9b1426d32a7d106b169be31cd0c30e1a98b73c7fb4cdf65533): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:43:49 crc kubenswrapper[4831]: E1203 06:43:49.341593 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"crc-storage-crc-p2xx8_crc-storage(a25efc27-9981-4e58-bf27-8e9650464fc4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"crc-storage-crc-p2xx8_crc-storage(a25efc27-9981-4e58-bf27-8e9650464fc4)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_crc-storage-crc-p2xx8_crc-storage_a25efc27-9981-4e58-bf27-8e9650464fc4_0(05c4629e9cecbe9b1426d32a7d106b169be31cd0c30e1a98b73c7fb4cdf65533): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="crc-storage/crc-storage-crc-p2xx8" podUID="a25efc27-9981-4e58-bf27-8e9650464fc4" Dec 03 06:43:49 crc kubenswrapper[4831]: I1203 06:43:49.392698 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:43:57 crc kubenswrapper[4831]: I1203 06:43:57.597259 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:43:57 crc kubenswrapper[4831]: I1203 06:43:57.598053 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:43:57 crc kubenswrapper[4831]: I1203 06:43:57.598137 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:43:57 crc kubenswrapper[4831]: 
I1203 06:43:57.598954 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"591c53f4c8c9620b5b60eed4f0d2632e242390fceb4ae25a90151135f08319c6"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:43:57 crc kubenswrapper[4831]: I1203 06:43:57.599056 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://591c53f4c8c9620b5b60eed4f0d2632e242390fceb4ae25a90151135f08319c6" gracePeriod=600 Dec 03 06:43:58 crc kubenswrapper[4831]: I1203 06:43:58.369959 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="591c53f4c8c9620b5b60eed4f0d2632e242390fceb4ae25a90151135f08319c6" exitCode=0 Dec 03 06:43:58 crc kubenswrapper[4831]: I1203 06:43:58.370007 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"591c53f4c8c9620b5b60eed4f0d2632e242390fceb4ae25a90151135f08319c6"} Dec 03 06:43:58 crc kubenswrapper[4831]: I1203 06:43:58.370435 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"9c066cdb31940f01296e6a59517a0bf4cdb6c0c7137c9abb1a013450afc9368b"} Dec 03 06:43:58 crc kubenswrapper[4831]: I1203 06:43:58.370474 4831 scope.go:117] "RemoveContainer" containerID="20ad819c7678e6daebd9a1a40a71eabae034828538ab794dbea604a758d7449c" Dec 03 06:44:03 crc kubenswrapper[4831]: I1203 06:44:03.012630 4831 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:44:03 crc kubenswrapper[4831]: I1203 06:44:03.017565 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:44:03 crc kubenswrapper[4831]: I1203 06:44:03.294811 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-p2xx8"] Dec 03 06:44:03 crc kubenswrapper[4831]: W1203 06:44:03.303933 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda25efc27_9981_4e58_bf27_8e9650464fc4.slice/crio-55b1241a6581b6fe810304249a97c6d1670a5823493a0e445ee5b7f5f97f74fb WatchSource:0}: Error finding container 55b1241a6581b6fe810304249a97c6d1670a5823493a0e445ee5b7f5f97f74fb: Status 404 returned error can't find the container with id 55b1241a6581b6fe810304249a97c6d1670a5823493a0e445ee5b7f5f97f74fb Dec 03 06:44:03 crc kubenswrapper[4831]: I1203 06:44:03.307259 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 06:44:03 crc kubenswrapper[4831]: I1203 06:44:03.328799 4831 scope.go:117] "RemoveContainer" containerID="2eb1b0783eb62f680604ecf7af5a64083dfba10f7beb1cb91448817c62828da4" Dec 03 06:44:03 crc kubenswrapper[4831]: I1203 06:44:03.405253 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2xx8" event={"ID":"a25efc27-9981-4e58-bf27-8e9650464fc4","Type":"ContainerStarted","Data":"55b1241a6581b6fe810304249a97c6d1670a5823493a0e445ee5b7f5f97f74fb"} Dec 03 06:44:03 crc kubenswrapper[4831]: I1203 06:44:03.407704 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vz8ft_74a16df4-1f25-4b0f-9e08-f6486f262a68/kube-multus/2.log" Dec 03 06:44:05 crc kubenswrapper[4831]: I1203 06:44:05.423894 4831 generic.go:334] "Generic (PLEG): container finished" podID="a25efc27-9981-4e58-bf27-8e9650464fc4" 
containerID="ef6263ec235effaf9016453dc7996fc4818b46714a07f7a0cfbfe3d2031d0c15" exitCode=0 Dec 03 06:44:05 crc kubenswrapper[4831]: I1203 06:44:05.423994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2xx8" event={"ID":"a25efc27-9981-4e58-bf27-8e9650464fc4","Type":"ContainerDied","Data":"ef6263ec235effaf9016453dc7996fc4818b46714a07f7a0cfbfe3d2031d0c15"} Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.726827 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.831649 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbhqr\" (UniqueName: \"kubernetes.io/projected/a25efc27-9981-4e58-bf27-8e9650464fc4-kube-api-access-zbhqr\") pod \"a25efc27-9981-4e58-bf27-8e9650464fc4\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.831756 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a25efc27-9981-4e58-bf27-8e9650464fc4-node-mnt\") pod \"a25efc27-9981-4e58-bf27-8e9650464fc4\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.831876 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a25efc27-9981-4e58-bf27-8e9650464fc4-crc-storage\") pod \"a25efc27-9981-4e58-bf27-8e9650464fc4\" (UID: \"a25efc27-9981-4e58-bf27-8e9650464fc4\") " Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.831897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a25efc27-9981-4e58-bf27-8e9650464fc4-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a25efc27-9981-4e58-bf27-8e9650464fc4" (UID: "a25efc27-9981-4e58-bf27-8e9650464fc4"). 
InnerVolumeSpecName "node-mnt". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.832191 4831 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a25efc27-9981-4e58-bf27-8e9650464fc4-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.839306 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a25efc27-9981-4e58-bf27-8e9650464fc4-kube-api-access-zbhqr" (OuterVolumeSpecName: "kube-api-access-zbhqr") pod "a25efc27-9981-4e58-bf27-8e9650464fc4" (UID: "a25efc27-9981-4e58-bf27-8e9650464fc4"). InnerVolumeSpecName "kube-api-access-zbhqr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.846994 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a25efc27-9981-4e58-bf27-8e9650464fc4-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a25efc27-9981-4e58-bf27-8e9650464fc4" (UID: "a25efc27-9981-4e58-bf27-8e9650464fc4"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.933919 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbhqr\" (UniqueName: \"kubernetes.io/projected/a25efc27-9981-4e58-bf27-8e9650464fc4-kube-api-access-zbhqr\") on node \"crc\" DevicePath \"\"" Dec 03 06:44:06 crc kubenswrapper[4831]: I1203 06:44:06.933954 4831 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a25efc27-9981-4e58-bf27-8e9650464fc4-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 03 06:44:07 crc kubenswrapper[4831]: I1203 06:44:07.439504 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-p2xx8" event={"ID":"a25efc27-9981-4e58-bf27-8e9650464fc4","Type":"ContainerDied","Data":"55b1241a6581b6fe810304249a97c6d1670a5823493a0e445ee5b7f5f97f74fb"} Dec 03 06:44:07 crc kubenswrapper[4831]: I1203 06:44:07.439561 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-p2xx8" Dec 03 06:44:07 crc kubenswrapper[4831]: I1203 06:44:07.439588 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="55b1241a6581b6fe810304249a97c6d1670a5823493a0e445ee5b7f5f97f74fb" Dec 03 06:44:10 crc kubenswrapper[4831]: I1203 06:44:10.347484 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-lgj4z" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.564701 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf"] Dec 03 06:44:14 crc kubenswrapper[4831]: E1203 06:44:14.565029 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a25efc27-9981-4e58-bf27-8e9650464fc4" containerName="storage" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.565049 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a25efc27-9981-4e58-bf27-8e9650464fc4" containerName="storage" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.565206 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a25efc27-9981-4e58-bf27-8e9650464fc4" containerName="storage" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.566275 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.568756 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.576424 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf"] Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.639158 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.639238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqchp\" (UniqueName: \"kubernetes.io/projected/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-kube-api-access-vqchp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.639277 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: 
I1203 06:44:14.740483 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.740542 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqchp\" (UniqueName: \"kubernetes.io/projected/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-kube-api-access-vqchp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.740573 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.741249 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-util\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.741445 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-bundle\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.766384 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqchp\" (UniqueName: \"kubernetes.io/projected/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-kube-api-access-vqchp\") pod \"5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:14 crc kubenswrapper[4831]: I1203 06:44:14.884819 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:15 crc kubenswrapper[4831]: I1203 06:44:15.174676 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf"] Dec 03 06:44:15 crc kubenswrapper[4831]: W1203 06:44:15.186094 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62af8b2f_d7af_47ca_9111_1e4fc68aaf8f.slice/crio-b6f819660368c4c33156e42647ffab2506d7ed23a8736c6d04182980e1fb90d3 WatchSource:0}: Error finding container b6f819660368c4c33156e42647ffab2506d7ed23a8736c6d04182980e1fb90d3: Status 404 returned error can't find the container with id b6f819660368c4c33156e42647ffab2506d7ed23a8736c6d04182980e1fb90d3 Dec 03 06:44:15 crc kubenswrapper[4831]: I1203 06:44:15.488239 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" 
event={"ID":"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f","Type":"ContainerStarted","Data":"a1d5916a7d1dc68dee6619bffe1c8844765bc8c01cee5c41a49f2f8508bbaa81"} Dec 03 06:44:15 crc kubenswrapper[4831]: I1203 06:44:15.488366 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" event={"ID":"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f","Type":"ContainerStarted","Data":"b6f819660368c4c33156e42647ffab2506d7ed23a8736c6d04182980e1fb90d3"} Dec 03 06:44:16 crc kubenswrapper[4831]: I1203 06:44:16.496081 4831 generic.go:334] "Generic (PLEG): container finished" podID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerID="a1d5916a7d1dc68dee6619bffe1c8844765bc8c01cee5c41a49f2f8508bbaa81" exitCode=0 Dec 03 06:44:16 crc kubenswrapper[4831]: I1203 06:44:16.496153 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" event={"ID":"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f","Type":"ContainerDied","Data":"a1d5916a7d1dc68dee6619bffe1c8844765bc8c01cee5c41a49f2f8508bbaa81"} Dec 03 06:44:16 crc kubenswrapper[4831]: I1203 06:44:16.913530 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-stpmp"] Dec 03 06:44:16 crc kubenswrapper[4831]: I1203 06:44:16.915567 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:16 crc kubenswrapper[4831]: I1203 06:44:16.935163 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-stpmp"] Dec 03 06:44:16 crc kubenswrapper[4831]: I1203 06:44:16.974211 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r8xd\" (UniqueName: \"kubernetes.io/projected/fa9dc00f-84d3-4a03-9e94-237dc59561d9-kube-api-access-4r8xd\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:16 crc kubenswrapper[4831]: I1203 06:44:16.974287 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-utilities\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:16 crc kubenswrapper[4831]: I1203 06:44:16.974531 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-catalog-content\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.075913 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-utilities\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.076059 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-catalog-content\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.076112 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r8xd\" (UniqueName: \"kubernetes.io/projected/fa9dc00f-84d3-4a03-9e94-237dc59561d9-kube-api-access-4r8xd\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.076725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-utilities\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.077265 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-catalog-content\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.102552 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r8xd\" (UniqueName: \"kubernetes.io/projected/fa9dc00f-84d3-4a03-9e94-237dc59561d9-kube-api-access-4r8xd\") pod \"redhat-operators-stpmp\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") " pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.255104 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-stpmp" Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.454472 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-stpmp"] Dec 03 06:44:17 crc kubenswrapper[4831]: I1203 06:44:17.505850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stpmp" event={"ID":"fa9dc00f-84d3-4a03-9e94-237dc59561d9","Type":"ContainerStarted","Data":"4c60c0e968cd5402b8b13c7d0913d6d1d188e78f34afb088e91c1239e35c3925"} Dec 03 06:44:18 crc kubenswrapper[4831]: I1203 06:44:18.513152 4831 generic.go:334] "Generic (PLEG): container finished" podID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerID="95f679ebfc904c07411ed0e3c6fa249211dd737d8fc83d247c59d37f26ef9687" exitCode=0 Dec 03 06:44:18 crc kubenswrapper[4831]: I1203 06:44:18.513254 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" event={"ID":"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f","Type":"ContainerDied","Data":"95f679ebfc904c07411ed0e3c6fa249211dd737d8fc83d247c59d37f26ef9687"} Dec 03 06:44:18 crc kubenswrapper[4831]: I1203 06:44:18.515629 4831 generic.go:334] "Generic (PLEG): container finished" podID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerID="898e7dda1df93c4e18252be600bd44d72f9d562e9500bed8d95216d8fbcf65d4" exitCode=0 Dec 03 06:44:18 crc kubenswrapper[4831]: I1203 06:44:18.515669 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stpmp" event={"ID":"fa9dc00f-84d3-4a03-9e94-237dc59561d9","Type":"ContainerDied","Data":"898e7dda1df93c4e18252be600bd44d72f9d562e9500bed8d95216d8fbcf65d4"} Dec 03 06:44:19 crc kubenswrapper[4831]: I1203 06:44:19.527528 4831 generic.go:334] "Generic (PLEG): container finished" podID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" 
containerID="a60cf22652cdbd587505ee6f992a3424d1e2c656a2c334554a8ae825aecadbb4" exitCode=0 Dec 03 06:44:19 crc kubenswrapper[4831]: I1203 06:44:19.527581 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" event={"ID":"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f","Type":"ContainerDied","Data":"a60cf22652cdbd587505ee6f992a3424d1e2c656a2c334554a8ae825aecadbb4"} Dec 03 06:44:19 crc kubenswrapper[4831]: I1203 06:44:19.532380 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stpmp" event={"ID":"fa9dc00f-84d3-4a03-9e94-237dc59561d9","Type":"ContainerStarted","Data":"f07fb920c8afb21786bcc8ebb7f09f817e979594b1b5293bb8c388b011ed46b8"} Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.546951 4831 generic.go:334] "Generic (PLEG): container finished" podID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerID="f07fb920c8afb21786bcc8ebb7f09f817e979594b1b5293bb8c388b011ed46b8" exitCode=0 Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.547017 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stpmp" event={"ID":"fa9dc00f-84d3-4a03-9e94-237dc59561d9","Type":"ContainerDied","Data":"f07fb920c8afb21786bcc8ebb7f09f817e979594b1b5293bb8c388b011ed46b8"} Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.831353 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.922746 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-bundle\") pod \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.922791 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-util\") pod \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.922874 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vqchp\" (UniqueName: \"kubernetes.io/projected/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-kube-api-access-vqchp\") pod \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\" (UID: \"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f\") " Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.923397 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-bundle" (OuterVolumeSpecName: "bundle") pod "62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" (UID: "62af8b2f-d7af-47ca-9111-1e4fc68aaf8f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.931196 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-kube-api-access-vqchp" (OuterVolumeSpecName: "kube-api-access-vqchp") pod "62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" (UID: "62af8b2f-d7af-47ca-9111-1e4fc68aaf8f"). InnerVolumeSpecName "kube-api-access-vqchp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:44:20 crc kubenswrapper[4831]: I1203 06:44:20.938708 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-util" (OuterVolumeSpecName: "util") pod "62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" (UID: "62af8b2f-d7af-47ca-9111-1e4fc68aaf8f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:44:21 crc kubenswrapper[4831]: I1203 06:44:21.024493 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vqchp\" (UniqueName: \"kubernetes.io/projected/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-kube-api-access-vqchp\") on node \"crc\" DevicePath \"\"" Dec 03 06:44:21 crc kubenswrapper[4831]: I1203 06:44:21.024533 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:44:21 crc kubenswrapper[4831]: I1203 06:44:21.024548 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/62af8b2f-d7af-47ca-9111-1e4fc68aaf8f-util\") on node \"crc\" DevicePath \"\"" Dec 03 06:44:21 crc kubenswrapper[4831]: I1203 06:44:21.555187 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" event={"ID":"62af8b2f-d7af-47ca-9111-1e4fc68aaf8f","Type":"ContainerDied","Data":"b6f819660368c4c33156e42647ffab2506d7ed23a8736c6d04182980e1fb90d3"} Dec 03 06:44:21 crc kubenswrapper[4831]: I1203 06:44:21.555222 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf" Dec 03 06:44:21 crc kubenswrapper[4831]: I1203 06:44:21.555226 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b6f819660368c4c33156e42647ffab2506d7ed23a8736c6d04182980e1fb90d3" Dec 03 06:44:21 crc kubenswrapper[4831]: I1203 06:44:21.558201 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stpmp" event={"ID":"fa9dc00f-84d3-4a03-9e94-237dc59561d9","Type":"ContainerStarted","Data":"010f067f9a017200c6dfcdacb52ced0445ce1f5561ee18495f2bd8f29dfd0392"} Dec 03 06:44:21 crc kubenswrapper[4831]: I1203 06:44:21.583476 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-stpmp" podStartSLOduration=3.031087676 podStartE2EDuration="5.583457753s" podCreationTimestamp="2025-12-03 06:44:16 +0000 UTC" firstStartedPulling="2025-12-03 06:44:18.52049312 +0000 UTC m=+795.864076628" lastFinishedPulling="2025-12-03 06:44:21.072863197 +0000 UTC m=+798.416446705" observedRunningTime="2025-12-03 06:44:21.57830678 +0000 UTC m=+798.921890368" watchObservedRunningTime="2025-12-03 06:44:21.583457753 +0000 UTC m=+798.927041271" Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.815107 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4"] Dec 03 06:44:24 crc kubenswrapper[4831]: E1203 06:44:24.815753 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerName="pull" Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.815763 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerName="pull" Dec 03 06:44:24 crc kubenswrapper[4831]: E1203 06:44:24.815778 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerName="util"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.815783 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerName="util"
Dec 03 06:44:24 crc kubenswrapper[4831]: E1203 06:44:24.815795 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerName="extract"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.815802 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerName="extract"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.815896 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="62af8b2f-d7af-47ca-9111-1e4fc68aaf8f" containerName="extract"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.816224 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.818546 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.818779 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.819766 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-7fpvl"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.827360 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4"]
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.870383 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8glm8\" (UniqueName: \"kubernetes.io/projected/52e0691e-6d8e-473a-84d0-11d5872313d7-kube-api-access-8glm8\") pod \"nmstate-operator-5b5b58f5c8-j4lt4\" (UID: \"52e0691e-6d8e-473a-84d0-11d5872313d7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.972042 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8glm8\" (UniqueName: \"kubernetes.io/projected/52e0691e-6d8e-473a-84d0-11d5872313d7-kube-api-access-8glm8\") pod \"nmstate-operator-5b5b58f5c8-j4lt4\" (UID: \"52e0691e-6d8e-473a-84d0-11d5872313d7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4"
Dec 03 06:44:24 crc kubenswrapper[4831]: I1203 06:44:24.993971 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8glm8\" (UniqueName: \"kubernetes.io/projected/52e0691e-6d8e-473a-84d0-11d5872313d7-kube-api-access-8glm8\") pod \"nmstate-operator-5b5b58f5c8-j4lt4\" (UID: \"52e0691e-6d8e-473a-84d0-11d5872313d7\") " pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4"
Dec 03 06:44:25 crc kubenswrapper[4831]: I1203 06:44:25.133465 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4"
Dec 03 06:44:25 crc kubenswrapper[4831]: I1203 06:44:25.372879 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4"]
Dec 03 06:44:25 crc kubenswrapper[4831]: I1203 06:44:25.581598 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4" event={"ID":"52e0691e-6d8e-473a-84d0-11d5872313d7","Type":"ContainerStarted","Data":"4838d7b52fc2d6666ba2267c97c7ba92ad9f402752c61ef2da21cf1af0276761"}
Dec 03 06:44:27 crc kubenswrapper[4831]: I1203 06:44:27.256122 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-stpmp"
Dec 03 06:44:27 crc kubenswrapper[4831]: I1203 06:44:27.256481 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-stpmp"
Dec 03 06:44:27 crc kubenswrapper[4831]: I1203 06:44:27.295978 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-stpmp"
Dec 03 06:44:27 crc kubenswrapper[4831]: I1203 06:44:27.645939 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-stpmp"
Dec 03 06:44:28 crc kubenswrapper[4831]: I1203 06:44:28.600208 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4" event={"ID":"52e0691e-6d8e-473a-84d0-11d5872313d7","Type":"ContainerStarted","Data":"a29155a2260f0a4e5a868bba4502446d582e82bf7059e73462cf057c61accf54"}
Dec 03 06:44:29 crc kubenswrapper[4831]: I1203 06:44:29.694828 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-5b5b58f5c8-j4lt4" podStartSLOduration=3.1861529 podStartE2EDuration="5.694795158s" podCreationTimestamp="2025-12-03 06:44:24 +0000 UTC" firstStartedPulling="2025-12-03 06:44:25.386829858 +0000 UTC m=+802.730413366" lastFinishedPulling="2025-12-03 06:44:27.895472116 +0000 UTC m=+805.239055624" observedRunningTime="2025-12-03 06:44:28.625555369 +0000 UTC m=+805.969138917" watchObservedRunningTime="2025-12-03 06:44:29.694795158 +0000 UTC m=+807.038378696"
Dec 03 06:44:29 crc kubenswrapper[4831]: I1203 06:44:29.700868 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-stpmp"]
Dec 03 06:44:30 crc kubenswrapper[4831]: I1203 06:44:30.610874 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-stpmp" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerName="registry-server" containerID="cri-o://010f067f9a017200c6dfcdacb52ced0445ce1f5561ee18495f2bd8f29dfd0392" gracePeriod=2
Dec 03 06:44:31 crc kubenswrapper[4831]: I1203 06:44:31.618766 4831 generic.go:334] "Generic (PLEG): container finished" podID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerID="010f067f9a017200c6dfcdacb52ced0445ce1f5561ee18495f2bd8f29dfd0392" exitCode=0
Dec 03 06:44:31 crc kubenswrapper[4831]: I1203 06:44:31.618829 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stpmp" event={"ID":"fa9dc00f-84d3-4a03-9e94-237dc59561d9","Type":"ContainerDied","Data":"010f067f9a017200c6dfcdacb52ced0445ce1f5561ee18495f2bd8f29dfd0392"}
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.117906 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-stpmp"
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.279846 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-utilities\") pod \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") "
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.279896 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-catalog-content\") pod \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") "
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.279943 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4r8xd\" (UniqueName: \"kubernetes.io/projected/fa9dc00f-84d3-4a03-9e94-237dc59561d9-kube-api-access-4r8xd\") pod \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\" (UID: \"fa9dc00f-84d3-4a03-9e94-237dc59561d9\") "
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.282638 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-utilities" (OuterVolumeSpecName: "utilities") pod "fa9dc00f-84d3-4a03-9e94-237dc59561d9" (UID: "fa9dc00f-84d3-4a03-9e94-237dc59561d9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.285584 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9dc00f-84d3-4a03-9e94-237dc59561d9-kube-api-access-4r8xd" (OuterVolumeSpecName: "kube-api-access-4r8xd") pod "fa9dc00f-84d3-4a03-9e94-237dc59561d9" (UID: "fa9dc00f-84d3-4a03-9e94-237dc59561d9"). InnerVolumeSpecName "kube-api-access-4r8xd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.381406 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.381441 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4r8xd\" (UniqueName: \"kubernetes.io/projected/fa9dc00f-84d3-4a03-9e94-237dc59561d9-kube-api-access-4r8xd\") on node \"crc\" DevicePath \"\""
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.432521 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fa9dc00f-84d3-4a03-9e94-237dc59561d9" (UID: "fa9dc00f-84d3-4a03-9e94-237dc59561d9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.483002 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fa9dc00f-84d3-4a03-9e94-237dc59561d9-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.630040 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-stpmp" event={"ID":"fa9dc00f-84d3-4a03-9e94-237dc59561d9","Type":"ContainerDied","Data":"4c60c0e968cd5402b8b13c7d0913d6d1d188e78f34afb088e91c1239e35c3925"}
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.631574 4831 scope.go:117] "RemoveContainer" containerID="010f067f9a017200c6dfcdacb52ced0445ce1f5561ee18495f2bd8f29dfd0392"
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.632445 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-stpmp"
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.660440 4831 scope.go:117] "RemoveContainer" containerID="f07fb920c8afb21786bcc8ebb7f09f817e979594b1b5293bb8c388b011ed46b8"
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.682045 4831 scope.go:117] "RemoveContainer" containerID="898e7dda1df93c4e18252be600bd44d72f9d562e9500bed8d95216d8fbcf65d4"
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.686885 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-stpmp"]
Dec 03 06:44:32 crc kubenswrapper[4831]: I1203 06:44:32.694174 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-stpmp"]
Dec 03 06:44:33 crc kubenswrapper[4831]: I1203 06:44:33.025109 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" path="/var/lib/kubelet/pods/fa9dc00f-84d3-4a03-9e94-237dc59561d9/volumes"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.129022 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2"]
Dec 03 06:44:35 crc kubenswrapper[4831]: E1203 06:44:35.129624 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerName="extract-utilities"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.129645 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerName="extract-utilities"
Dec 03 06:44:35 crc kubenswrapper[4831]: E1203 06:44:35.129670 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerName="registry-server"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.129681 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerName="registry-server"
Dec 03 06:44:35 crc kubenswrapper[4831]: E1203 06:44:35.129695 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerName="extract-content"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.129707 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerName="extract-content"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.129863 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9dc00f-84d3-4a03-9e94-237dc59561d9" containerName="registry-server"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.130636 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.139611 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.140383 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.144299 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.145050 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.148733 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-2lwsm"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.159866 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-frsqc"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.161159 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.176814 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.228934 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbrmz\" (UniqueName: \"kubernetes.io/projected/328e107f-bb1a-448c-92c0-7aacaa6bb84f-kube-api-access-rbrmz\") pod \"nmstate-metrics-7f946cbc9-m2cm2\" (UID: \"328e107f-bb1a-448c-92c0-7aacaa6bb84f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.254161 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.254896 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.257867 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.257974 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-j55mq"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.257878 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.269512 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.330430 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/123ef7d5-4ad6-4a82-8dc7-63621e57d51c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gl6ns\" (UID: \"123ef7d5-4ad6-4a82-8dc7-63621e57d51c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.330498 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bxpw\" (UniqueName: \"kubernetes.io/projected/a85480d7-0191-4b3b-8542-ee01a494109f-kube-api-access-8bxpw\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.330571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbrmz\" (UniqueName: \"kubernetes.io/projected/328e107f-bb1a-448c-92c0-7aacaa6bb84f-kube-api-access-rbrmz\") pod \"nmstate-metrics-7f946cbc9-m2cm2\" (UID: \"328e107f-bb1a-448c-92c0-7aacaa6bb84f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.330601 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-nmstate-lock\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.330626 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-dbus-socket\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.330646 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-ovs-socket\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.330661 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktxxr\" (UniqueName: \"kubernetes.io/projected/123ef7d5-4ad6-4a82-8dc7-63621e57d51c-kube-api-access-ktxxr\") pod \"nmstate-webhook-5f6d4c5ccb-gl6ns\" (UID: \"123ef7d5-4ad6-4a82-8dc7-63621e57d51c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.370471 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbrmz\" (UniqueName: \"kubernetes.io/projected/328e107f-bb1a-448c-92c0-7aacaa6bb84f-kube-api-access-rbrmz\") pod \"nmstate-metrics-7f946cbc9-m2cm2\" (UID: \"328e107f-bb1a-448c-92c0-7aacaa6bb84f\") " pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.431783 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-nmstate-lock\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.431855 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-dbus-socket\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.431882 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-ovs-socket\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.431895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-nmstate-lock\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.431904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktxxr\" (UniqueName: \"kubernetes.io/projected/123ef7d5-4ad6-4a82-8dc7-63621e57d51c-kube-api-access-ktxxr\") pod \"nmstate-webhook-5f6d4c5ccb-gl6ns\" (UID: \"123ef7d5-4ad6-4a82-8dc7-63621e57d51c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.432147 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-ovs-socket\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.432165 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b13740f-a5e5-40c1-8925-12aaa3a9498c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.432197 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/123ef7d5-4ad6-4a82-8dc7-63621e57d51c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gl6ns\" (UID: \"123ef7d5-4ad6-4a82-8dc7-63621e57d51c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.432239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b13740f-a5e5-40c1-8925-12aaa3a9498c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.432284 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bxpw\" (UniqueName: \"kubernetes.io/projected/a85480d7-0191-4b3b-8542-ee01a494109f-kube-api-access-8bxpw\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.432336 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sk7g6\" (UniqueName: \"kubernetes.io/projected/9b13740f-a5e5-40c1-8925-12aaa3a9498c-kube-api-access-sk7g6\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.432895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a85480d7-0191-4b3b-8542-ee01a494109f-dbus-socket\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.442131 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/123ef7d5-4ad6-4a82-8dc7-63621e57d51c-tls-key-pair\") pod \"nmstate-webhook-5f6d4c5ccb-gl6ns\" (UID: \"123ef7d5-4ad6-4a82-8dc7-63621e57d51c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.450211 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bxpw\" (UniqueName: \"kubernetes.io/projected/a85480d7-0191-4b3b-8542-ee01a494109f-kube-api-access-8bxpw\") pod \"nmstate-handler-frsqc\" (UID: \"a85480d7-0191-4b3b-8542-ee01a494109f\") " pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.455016 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktxxr\" (UniqueName: \"kubernetes.io/projected/123ef7d5-4ad6-4a82-8dc7-63621e57d51c-kube-api-access-ktxxr\") pod \"nmstate-webhook-5f6d4c5ccb-gl6ns\" (UID: \"123ef7d5-4ad6-4a82-8dc7-63621e57d51c\") " pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.462644 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.463558 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.486385 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-frsqc"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.499872 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-d5cb88c8d-k2jfg"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.500696 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.517010 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d5cb88c8d-k2jfg"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.533175 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b13740f-a5e5-40c1-8925-12aaa3a9498c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.533223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sk7g6\" (UniqueName: \"kubernetes.io/projected/9b13740f-a5e5-40c1-8925-12aaa3a9498c-kube-api-access-sk7g6\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.533275 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b13740f-a5e5-40c1-8925-12aaa3a9498c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.534489 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/9b13740f-a5e5-40c1-8925-12aaa3a9498c-nginx-conf\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.541664 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/9b13740f-a5e5-40c1-8925-12aaa3a9498c-plugin-serving-cert\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.553601 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sk7g6\" (UniqueName: \"kubernetes.io/projected/9b13740f-a5e5-40c1-8925-12aaa3a9498c-kube-api-access-sk7g6\") pod \"nmstate-console-plugin-7fbb5f6569-4h5zz\" (UID: \"9b13740f-a5e5-40c1-8925-12aaa3a9498c\") " pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.569708 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.634702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-serving-cert\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.634753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-service-ca\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.634782 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-oauth-config\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.634803 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-oauth-serving-cert\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.634824 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-trusted-ca-bundle\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.634843 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw8px\" (UniqueName: \"kubernetes.io/projected/218bafdc-a6a9-4ec4-9f96-868589faeb75-kube-api-access-zw8px\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.634860 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-config\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.656352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-frsqc" event={"ID":"a85480d7-0191-4b3b-8542-ee01a494109f","Type":"ContainerStarted","Data":"9c8291e1100acf09bbd9cd0cd7f08bf229661dd2f674aee633a5e8d4f451ba17"}
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.693228 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns"]
Dec 03 06:44:35 crc kubenswrapper[4831]: W1203 06:44:35.703499 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod123ef7d5_4ad6_4a82_8dc7_63621e57d51c.slice/crio-77baae172d1d6d58f3592b9b9df84caeed8281026e3eb7a4f4139aa6fa34721f WatchSource:0}: Error finding container 77baae172d1d6d58f3592b9b9df84caeed8281026e3eb7a4f4139aa6fa34721f: Status 404 returned error can't find the container with id 77baae172d1d6d58f3592b9b9df84caeed8281026e3eb7a4f4139aa6fa34721f
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.736014 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-serving-cert\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.736078 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-service-ca\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.736112 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-oauth-config\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.736138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-oauth-serving-cert\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.736169 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-trusted-ca-bundle\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.736190 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zw8px\" (UniqueName: \"kubernetes.io/projected/218bafdc-a6a9-4ec4-9f96-868589faeb75-kube-api-access-zw8px\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.736216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-config\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.737272 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-config\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.737272 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-oauth-serving-cert\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.737397 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-service-ca\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.738273 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/218bafdc-a6a9-4ec4-9f96-868589faeb75-trusted-ca-bundle\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.739970 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-oauth-config\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.740152 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2"]
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.742346 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/218bafdc-a6a9-4ec4-9f96-868589faeb75-console-serving-cert\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.757037 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw8px\" (UniqueName: \"kubernetes.io/projected/218bafdc-a6a9-4ec4-9f96-868589faeb75-kube-api-access-zw8px\") pod \"console-d5cb88c8d-k2jfg\" (UID: \"218bafdc-a6a9-4ec4-9f96-868589faeb75\") " pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:35 crc kubenswrapper[4831]: I1203 06:44:35.839941 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-d5cb88c8d-k2jfg"
Dec 03 06:44:36 crc kubenswrapper[4831]: I1203 06:44:36.012929 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz"]
Dec 03 06:44:36 crc kubenswrapper[4831]: W1203 06:44:36.020903 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b13740f_a5e5_40c1_8925_12aaa3a9498c.slice/crio-e7217d680d333151336dbffedbb90f03159f4a732840df901bcdef12a8a10e97 WatchSource:0}: Error finding container e7217d680d333151336dbffedbb90f03159f4a732840df901bcdef12a8a10e97: Status 404 returned error can't find the container with id e7217d680d333151336dbffedbb90f03159f4a732840df901bcdef12a8a10e97
Dec 03 06:44:36 crc kubenswrapper[4831]: I1203 06:44:36.030425 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-d5cb88c8d-k2jfg"]
Dec 03 06:44:36 crc kubenswrapper[4831]: W1203 06:44:36.036151 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod218bafdc_a6a9_4ec4_9f96_868589faeb75.slice/crio-42b0d825167a6a1433c44d074f95dd478c34185ebb660390f01d83b9f3fb4567
WatchSource:0}: Error finding container 42b0d825167a6a1433c44d074f95dd478c34185ebb660390f01d83b9f3fb4567: Status 404 returned error can't find the container with id 42b0d825167a6a1433c44d074f95dd478c34185ebb660390f01d83b9f3fb4567 Dec 03 06:44:36 crc kubenswrapper[4831]: I1203 06:44:36.666389 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns" event={"ID":"123ef7d5-4ad6-4a82-8dc7-63621e57d51c","Type":"ContainerStarted","Data":"77baae172d1d6d58f3592b9b9df84caeed8281026e3eb7a4f4139aa6fa34721f"} Dec 03 06:44:36 crc kubenswrapper[4831]: I1203 06:44:36.667768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d5cb88c8d-k2jfg" event={"ID":"218bafdc-a6a9-4ec4-9f96-868589faeb75","Type":"ContainerStarted","Data":"31defedb8298e7240704c197537e146862f1e75973940f8f16078ee69f8dfe88"} Dec 03 06:44:36 crc kubenswrapper[4831]: I1203 06:44:36.667805 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-d5cb88c8d-k2jfg" event={"ID":"218bafdc-a6a9-4ec4-9f96-868589faeb75","Type":"ContainerStarted","Data":"42b0d825167a6a1433c44d074f95dd478c34185ebb660390f01d83b9f3fb4567"} Dec 03 06:44:36 crc kubenswrapper[4831]: I1203 06:44:36.669285 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz" event={"ID":"9b13740f-a5e5-40c1-8925-12aaa3a9498c","Type":"ContainerStarted","Data":"e7217d680d333151336dbffedbb90f03159f4a732840df901bcdef12a8a10e97"} Dec 03 06:44:36 crc kubenswrapper[4831]: I1203 06:44:36.671637 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2" event={"ID":"328e107f-bb1a-448c-92c0-7aacaa6bb84f","Type":"ContainerStarted","Data":"b2e24c004729a6f8c516725f83454d19c458062746edea57241ad3cfb911f8fb"} Dec 03 06:44:36 crc kubenswrapper[4831]: I1203 06:44:36.689659 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-d5cb88c8d-k2jfg" podStartSLOduration=1.689642474 podStartE2EDuration="1.689642474s" podCreationTimestamp="2025-12-03 06:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:44:36.688202649 +0000 UTC m=+814.031786157" watchObservedRunningTime="2025-12-03 06:44:36.689642474 +0000 UTC m=+814.033225982" Dec 03 06:44:38 crc kubenswrapper[4831]: I1203 06:44:38.684006 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-frsqc" event={"ID":"a85480d7-0191-4b3b-8542-ee01a494109f","Type":"ContainerStarted","Data":"35749d6bb2bb08374168c1825f9ec30e4ab294418de0673db2df5c6b4613bb5c"} Dec 03 06:44:38 crc kubenswrapper[4831]: I1203 06:44:38.684581 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-frsqc" Dec 03 06:44:38 crc kubenswrapper[4831]: I1203 06:44:38.688373 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2" event={"ID":"328e107f-bb1a-448c-92c0-7aacaa6bb84f","Type":"ContainerStarted","Data":"bb49ed655117969e6fd554c2e06ea25a4d43abf292c5e060f06d43aa078c976e"} Dec 03 06:44:38 crc kubenswrapper[4831]: I1203 06:44:38.689882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns" event={"ID":"123ef7d5-4ad6-4a82-8dc7-63621e57d51c","Type":"ContainerStarted","Data":"c4c7c72fbb17f96e352adf4882034c0c866094ed9993d7d83060fd69362c761f"} Dec 03 06:44:38 crc kubenswrapper[4831]: I1203 06:44:38.690010 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns" Dec 03 06:44:38 crc kubenswrapper[4831]: I1203 06:44:38.702120 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-frsqc" podStartSLOduration=1.42016799 
podStartE2EDuration="3.702104252s" podCreationTimestamp="2025-12-03 06:44:35 +0000 UTC" firstStartedPulling="2025-12-03 06:44:35.547447869 +0000 UTC m=+812.891031377" lastFinishedPulling="2025-12-03 06:44:37.829384131 +0000 UTC m=+815.172967639" observedRunningTime="2025-12-03 06:44:38.697585959 +0000 UTC m=+816.041169467" watchObservedRunningTime="2025-12-03 06:44:38.702104252 +0000 UTC m=+816.045687760" Dec 03 06:44:39 crc kubenswrapper[4831]: I1203 06:44:39.695382 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz" event={"ID":"9b13740f-a5e5-40c1-8925-12aaa3a9498c","Type":"ContainerStarted","Data":"b33fd139dbba4a89a6e010118e31af89559f52ed78fc517315c1799325739b3d"} Dec 03 06:44:39 crc kubenswrapper[4831]: I1203 06:44:39.716541 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7fbb5f6569-4h5zz" podStartSLOduration=2.040648735 podStartE2EDuration="4.716209341s" podCreationTimestamp="2025-12-03 06:44:35 +0000 UTC" firstStartedPulling="2025-12-03 06:44:36.027777426 +0000 UTC m=+813.371360934" lastFinishedPulling="2025-12-03 06:44:38.703338032 +0000 UTC m=+816.046921540" observedRunningTime="2025-12-03 06:44:39.713258038 +0000 UTC m=+817.056841546" watchObservedRunningTime="2025-12-03 06:44:39.716209341 +0000 UTC m=+817.059792849" Dec 03 06:44:39 crc kubenswrapper[4831]: I1203 06:44:39.718297 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns" podStartSLOduration=2.580683907 podStartE2EDuration="4.718288637s" podCreationTimestamp="2025-12-03 06:44:35 +0000 UTC" firstStartedPulling="2025-12-03 06:44:35.70561053 +0000 UTC m=+813.049194038" lastFinishedPulling="2025-12-03 06:44:37.84321526 +0000 UTC m=+815.186798768" observedRunningTime="2025-12-03 06:44:38.717660016 +0000 UTC m=+816.061243524" watchObservedRunningTime="2025-12-03 06:44:39.718288637 +0000 UTC 
m=+817.061872145" Dec 03 06:44:40 crc kubenswrapper[4831]: I1203 06:44:40.703806 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2" event={"ID":"328e107f-bb1a-448c-92c0-7aacaa6bb84f","Type":"ContainerStarted","Data":"69e6717a96719136b4816ca96f950cfca20f8eb6c41f4c36a8fb1f2337db6fe9"} Dec 03 06:44:40 crc kubenswrapper[4831]: I1203 06:44:40.728040 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-7f946cbc9-m2cm2" podStartSLOduration=1.497183007 podStartE2EDuration="5.728014638s" podCreationTimestamp="2025-12-03 06:44:35 +0000 UTC" firstStartedPulling="2025-12-03 06:44:35.745234108 +0000 UTC m=+813.088817616" lastFinishedPulling="2025-12-03 06:44:39.976065729 +0000 UTC m=+817.319649247" observedRunningTime="2025-12-03 06:44:40.725769057 +0000 UTC m=+818.069352625" watchObservedRunningTime="2025-12-03 06:44:40.728014638 +0000 UTC m=+818.071598176" Dec 03 06:44:45 crc kubenswrapper[4831]: I1203 06:44:45.528023 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-frsqc" Dec 03 06:44:45 crc kubenswrapper[4831]: I1203 06:44:45.840203 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-d5cb88c8d-k2jfg" Dec 03 06:44:45 crc kubenswrapper[4831]: I1203 06:44:45.840300 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-d5cb88c8d-k2jfg" Dec 03 06:44:45 crc kubenswrapper[4831]: I1203 06:44:45.847753 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-d5cb88c8d-k2jfg" Dec 03 06:44:46 crc kubenswrapper[4831]: I1203 06:44:46.751543 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-d5cb88c8d-k2jfg" Dec 03 06:44:46 crc kubenswrapper[4831]: I1203 06:44:46.824008 4831 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-console/console-f9d7485db-dwsb6"] Dec 03 06:44:55 crc kubenswrapper[4831]: I1203 06:44:55.470728 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f6d4c5ccb-gl6ns" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.145814 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c"] Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.146836 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.148629 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.149760 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.159234 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c"] Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.268061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snskk\" (UniqueName: \"kubernetes.io/projected/81173897-29c5-4ce0-a308-f48eadb82cc4-kube-api-access-snskk\") pod \"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.268138 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81173897-29c5-4ce0-a308-f48eadb82cc4-config-volume\") pod 
\"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.268181 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81173897-29c5-4ce0-a308-f48eadb82cc4-secret-volume\") pod \"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.369776 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81173897-29c5-4ce0-a308-f48eadb82cc4-secret-volume\") pod \"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.369898 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snskk\" (UniqueName: \"kubernetes.io/projected/81173897-29c5-4ce0-a308-f48eadb82cc4-kube-api-access-snskk\") pod \"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.369929 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81173897-29c5-4ce0-a308-f48eadb82cc4-config-volume\") pod \"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.375463 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81173897-29c5-4ce0-a308-f48eadb82cc4-config-volume\") pod \"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.378353 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81173897-29c5-4ce0-a308-f48eadb82cc4-secret-volume\") pod \"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.385384 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snskk\" (UniqueName: \"kubernetes.io/projected/81173897-29c5-4ce0-a308-f48eadb82cc4-kube-api-access-snskk\") pod \"collect-profiles-29412405-bq88c\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.462991 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.704339 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c"] Dec 03 06:45:00 crc kubenswrapper[4831]: W1203 06:45:00.709841 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81173897_29c5_4ce0_a308_f48eadb82cc4.slice/crio-42c1d75cdab3a30bc415b1f3bac926f15a9af1bec92f841c751f7c34e3848d70 WatchSource:0}: Error finding container 42c1d75cdab3a30bc415b1f3bac926f15a9af1bec92f841c751f7c34e3848d70: Status 404 returned error can't find the container with id 42c1d75cdab3a30bc415b1f3bac926f15a9af1bec92f841c751f7c34e3848d70 Dec 03 06:45:00 crc kubenswrapper[4831]: I1203 06:45:00.834358 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" event={"ID":"81173897-29c5-4ce0-a308-f48eadb82cc4","Type":"ContainerStarted","Data":"42c1d75cdab3a30bc415b1f3bac926f15a9af1bec92f841c751f7c34e3848d70"} Dec 03 06:45:01 crc kubenswrapper[4831]: I1203 06:45:01.842284 4831 generic.go:334] "Generic (PLEG): container finished" podID="81173897-29c5-4ce0-a308-f48eadb82cc4" containerID="a05de0e8b4131a01238775493bdc11ac50bac649e8c4b73a406888571d862008" exitCode=0 Dec 03 06:45:01 crc kubenswrapper[4831]: I1203 06:45:01.842363 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" event={"ID":"81173897-29c5-4ce0-a308-f48eadb82cc4","Type":"ContainerDied","Data":"a05de0e8b4131a01238775493bdc11ac50bac649e8c4b73a406888571d862008"} Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.148144 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.317815 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81173897-29c5-4ce0-a308-f48eadb82cc4-config-volume\") pod \"81173897-29c5-4ce0-a308-f48eadb82cc4\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.318098 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81173897-29c5-4ce0-a308-f48eadb82cc4-secret-volume\") pod \"81173897-29c5-4ce0-a308-f48eadb82cc4\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.318150 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snskk\" (UniqueName: \"kubernetes.io/projected/81173897-29c5-4ce0-a308-f48eadb82cc4-kube-api-access-snskk\") pod \"81173897-29c5-4ce0-a308-f48eadb82cc4\" (UID: \"81173897-29c5-4ce0-a308-f48eadb82cc4\") " Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.318310 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81173897-29c5-4ce0-a308-f48eadb82cc4-config-volume" (OuterVolumeSpecName: "config-volume") pod "81173897-29c5-4ce0-a308-f48eadb82cc4" (UID: "81173897-29c5-4ce0-a308-f48eadb82cc4"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.318517 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/81173897-29c5-4ce0-a308-f48eadb82cc4-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.322705 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81173897-29c5-4ce0-a308-f48eadb82cc4-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "81173897-29c5-4ce0-a308-f48eadb82cc4" (UID: "81173897-29c5-4ce0-a308-f48eadb82cc4"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.328503 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81173897-29c5-4ce0-a308-f48eadb82cc4-kube-api-access-snskk" (OuterVolumeSpecName: "kube-api-access-snskk") pod "81173897-29c5-4ce0-a308-f48eadb82cc4" (UID: "81173897-29c5-4ce0-a308-f48eadb82cc4"). InnerVolumeSpecName "kube-api-access-snskk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.419340 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/81173897-29c5-4ce0-a308-f48eadb82cc4-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.419370 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snskk\" (UniqueName: \"kubernetes.io/projected/81173897-29c5-4ce0-a308-f48eadb82cc4-kube-api-access-snskk\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.852303 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" event={"ID":"81173897-29c5-4ce0-a308-f48eadb82cc4","Type":"ContainerDied","Data":"42c1d75cdab3a30bc415b1f3bac926f15a9af1bec92f841c751f7c34e3848d70"} Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.852375 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42c1d75cdab3a30bc415b1f3bac926f15a9af1bec92f841c751f7c34e3848d70" Dec 03 06:45:03 crc kubenswrapper[4831]: I1203 06:45:03.852352 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.267179 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj"] Dec 03 06:45:11 crc kubenswrapper[4831]: E1203 06:45:11.268011 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81173897-29c5-4ce0-a308-f48eadb82cc4" containerName="collect-profiles" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.268027 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="81173897-29c5-4ce0-a308-f48eadb82cc4" containerName="collect-profiles" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.268187 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="81173897-29c5-4ce0-a308-f48eadb82cc4" containerName="collect-profiles" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.269359 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.272950 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.285401 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj"] Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.343160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.343305 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hf2ks\" (UniqueName: \"kubernetes.io/projected/6be5b407-422d-4d4c-8897-d2477b1c1ae1-kube-api-access-hf2ks\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.343405 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: 
I1203 06:45:11.444343 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.444419 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hf2ks\" (UniqueName: \"kubernetes.io/projected/6be5b407-422d-4d4c-8897-d2477b1c1ae1-kube-api-access-hf2ks\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.444478 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.445051 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-bundle\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.446657 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-util\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.472266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hf2ks\" (UniqueName: \"kubernetes.io/projected/6be5b407-422d-4d4c-8897-d2477b1c1ae1-kube-api-access-hf2ks\") pod \"af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.591563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:11 crc kubenswrapper[4831]: I1203 06:45:11.883007 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-dwsb6" podUID="c289d28d-642e-4cc4-9d25-f025800585d1" containerName="console" containerID="cri-o://0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f" gracePeriod=15 Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.049524 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj"] Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.177577 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dwsb6_c289d28d-642e-4cc4-9d25-f025800585d1/console/0.log" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.177636 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.355183 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-oauth-serving-cert\") pod \"c289d28d-642e-4cc4-9d25-f025800585d1\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.355258 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-console-config\") pod \"c289d28d-642e-4cc4-9d25-f025800585d1\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.355361 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jc4qm\" (UniqueName: \"kubernetes.io/projected/c289d28d-642e-4cc4-9d25-f025800585d1-kube-api-access-jc4qm\") pod \"c289d28d-642e-4cc4-9d25-f025800585d1\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.355533 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-serving-cert\") pod \"c289d28d-642e-4cc4-9d25-f025800585d1\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.355578 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-trusted-ca-bundle\") pod \"c289d28d-642e-4cc4-9d25-f025800585d1\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.355619 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-oauth-config\") pod \"c289d28d-642e-4cc4-9d25-f025800585d1\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.355673 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-service-ca\") pod \"c289d28d-642e-4cc4-9d25-f025800585d1\" (UID: \"c289d28d-642e-4cc4-9d25-f025800585d1\") " Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.356254 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "c289d28d-642e-4cc4-9d25-f025800585d1" (UID: "c289d28d-642e-4cc4-9d25-f025800585d1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.356920 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-service-ca" (OuterVolumeSpecName: "service-ca") pod "c289d28d-642e-4cc4-9d25-f025800585d1" (UID: "c289d28d-642e-4cc4-9d25-f025800585d1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.356953 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "c289d28d-642e-4cc4-9d25-f025800585d1" (UID: "c289d28d-642e-4cc4-9d25-f025800585d1"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.357036 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-console-config" (OuterVolumeSpecName: "console-config") pod "c289d28d-642e-4cc4-9d25-f025800585d1" (UID: "c289d28d-642e-4cc4-9d25-f025800585d1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.360422 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c289d28d-642e-4cc4-9d25-f025800585d1-kube-api-access-jc4qm" (OuterVolumeSpecName: "kube-api-access-jc4qm") pod "c289d28d-642e-4cc4-9d25-f025800585d1" (UID: "c289d28d-642e-4cc4-9d25-f025800585d1"). InnerVolumeSpecName "kube-api-access-jc4qm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.361132 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "c289d28d-642e-4cc4-9d25-f025800585d1" (UID: "c289d28d-642e-4cc4-9d25-f025800585d1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.361816 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "c289d28d-642e-4cc4-9d25-f025800585d1" (UID: "c289d28d-642e-4cc4-9d25-f025800585d1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.458136 4831 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-service-ca\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.458203 4831 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-console-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.458222 4831 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.458240 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jc4qm\" (UniqueName: \"kubernetes.io/projected/c289d28d-642e-4cc4-9d25-f025800585d1-kube-api-access-jc4qm\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.458257 4831 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-serving-cert\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.458276 4831 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c289d28d-642e-4cc4-9d25-f025800585d1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.458293 4831 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/c289d28d-642e-4cc4-9d25-f025800585d1-console-oauth-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:12 crc 
kubenswrapper[4831]: I1203 06:45:12.907275 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-dwsb6_c289d28d-642e-4cc4-9d25-f025800585d1/console/0.log" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.907633 4831 generic.go:334] "Generic (PLEG): container finished" podID="c289d28d-642e-4cc4-9d25-f025800585d1" containerID="0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f" exitCode=2 Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.907717 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dwsb6" event={"ID":"c289d28d-642e-4cc4-9d25-f025800585d1","Type":"ContainerDied","Data":"0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f"} Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.907729 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-dwsb6" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.907749 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-dwsb6" event={"ID":"c289d28d-642e-4cc4-9d25-f025800585d1","Type":"ContainerDied","Data":"40d62e3900bad6dab75bc85f9d1a23c515aea109667b2a9eb825f19b5a35c0c4"} Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.907771 4831 scope.go:117] "RemoveContainer" containerID="0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.909307 4831 generic.go:334] "Generic (PLEG): container finished" podID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerID="8d347f8bda36aeedb35d11dfba4dc7b4059d22ca4f374e62841129d8acb8b77a" exitCode=0 Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.909344 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" 
event={"ID":"6be5b407-422d-4d4c-8897-d2477b1c1ae1","Type":"ContainerDied","Data":"8d347f8bda36aeedb35d11dfba4dc7b4059d22ca4f374e62841129d8acb8b77a"} Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.909383 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" event={"ID":"6be5b407-422d-4d4c-8897-d2477b1c1ae1","Type":"ContainerStarted","Data":"f27a7dfce6cf1f73164632e176274be2ba968d3d9b1ed3e6cbf0e77accc3b989"} Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.925742 4831 scope.go:117] "RemoveContainer" containerID="0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f" Dec 03 06:45:12 crc kubenswrapper[4831]: E1203 06:45:12.926131 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f\": container with ID starting with 0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f not found: ID does not exist" containerID="0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.926163 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f"} err="failed to get container status \"0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f\": rpc error: code = NotFound desc = could not find container \"0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f\": container with ID starting with 0e7b20210f38cf82dcda97b3c3ee0a459fc00cbccda882277073a39f4efc7e6f not found: ID does not exist" Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.955243 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-dwsb6"] Dec 03 06:45:12 crc kubenswrapper[4831]: I1203 06:45:12.962836 4831 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-dwsb6"] Dec 03 06:45:13 crc kubenswrapper[4831]: I1203 06:45:13.023455 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c289d28d-642e-4cc4-9d25-f025800585d1" path="/var/lib/kubelet/pods/c289d28d-642e-4cc4-9d25-f025800585d1/volumes" Dec 03 06:45:14 crc kubenswrapper[4831]: I1203 06:45:14.930159 4831 generic.go:334] "Generic (PLEG): container finished" podID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerID="673b5aede08c0dd271298c83ccb98fbc056e06c4bb5426725bc9e485c49e3f33" exitCode=0 Dec 03 06:45:14 crc kubenswrapper[4831]: I1203 06:45:14.930274 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" event={"ID":"6be5b407-422d-4d4c-8897-d2477b1c1ae1","Type":"ContainerDied","Data":"673b5aede08c0dd271298c83ccb98fbc056e06c4bb5426725bc9e485c49e3f33"} Dec 03 06:45:15 crc kubenswrapper[4831]: I1203 06:45:15.940710 4831 generic.go:334] "Generic (PLEG): container finished" podID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerID="2b74dc710ec425aed72c6748f45ae1fed4074d1e23ec25f43e2fade0780126f0" exitCode=0 Dec 03 06:45:15 crc kubenswrapper[4831]: I1203 06:45:15.940778 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" event={"ID":"6be5b407-422d-4d4c-8897-d2477b1c1ae1","Type":"ContainerDied","Data":"2b74dc710ec425aed72c6748f45ae1fed4074d1e23ec25f43e2fade0780126f0"} Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.248023 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.434090 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hf2ks\" (UniqueName: \"kubernetes.io/projected/6be5b407-422d-4d4c-8897-d2477b1c1ae1-kube-api-access-hf2ks\") pod \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.434154 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-util\") pod \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.434246 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-bundle\") pod \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\" (UID: \"6be5b407-422d-4d4c-8897-d2477b1c1ae1\") " Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.435596 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-bundle" (OuterVolumeSpecName: "bundle") pod "6be5b407-422d-4d4c-8897-d2477b1c1ae1" (UID: "6be5b407-422d-4d4c-8897-d2477b1c1ae1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.445448 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6be5b407-422d-4d4c-8897-d2477b1c1ae1-kube-api-access-hf2ks" (OuterVolumeSpecName: "kube-api-access-hf2ks") pod "6be5b407-422d-4d4c-8897-d2477b1c1ae1" (UID: "6be5b407-422d-4d4c-8897-d2477b1c1ae1"). InnerVolumeSpecName "kube-api-access-hf2ks". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.535386 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.535434 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hf2ks\" (UniqueName: \"kubernetes.io/projected/6be5b407-422d-4d4c-8897-d2477b1c1ae1-kube-api-access-hf2ks\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.546038 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-util" (OuterVolumeSpecName: "util") pod "6be5b407-422d-4d4c-8897-d2477b1c1ae1" (UID: "6be5b407-422d-4d4c-8897-d2477b1c1ae1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.636825 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6be5b407-422d-4d4c-8897-d2477b1c1ae1-util\") on node \"crc\" DevicePath \"\"" Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.969161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" event={"ID":"6be5b407-422d-4d4c-8897-d2477b1c1ae1","Type":"ContainerDied","Data":"f27a7dfce6cf1f73164632e176274be2ba968d3d9b1ed3e6cbf0e77accc3b989"} Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.969517 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f27a7dfce6cf1f73164632e176274be2ba968d3d9b1ed3e6cbf0e77accc3b989" Dec 03 06:45:17 crc kubenswrapper[4831]: I1203 06:45:17.969234 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.126492 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb"] Dec 03 06:45:27 crc kubenswrapper[4831]: E1203 06:45:27.127523 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerName="extract" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.127541 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerName="extract" Dec 03 06:45:27 crc kubenswrapper[4831]: E1203 06:45:27.127564 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c289d28d-642e-4cc4-9d25-f025800585d1" containerName="console" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.127571 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c289d28d-642e-4cc4-9d25-f025800585d1" containerName="console" Dec 03 06:45:27 crc kubenswrapper[4831]: E1203 06:45:27.127582 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerName="pull" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.127590 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerName="pull" Dec 03 06:45:27 crc kubenswrapper[4831]: E1203 06:45:27.127600 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerName="util" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.127608 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerName="util" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.127729 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c289d28d-642e-4cc4-9d25-f025800585d1" 
containerName="console" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.127744 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6be5b407-422d-4d4c-8897-d2477b1c1ae1" containerName="extract" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.128203 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.131386 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.131399 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.131937 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-54c8h" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.132374 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.133474 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.148724 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb"] Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.255536 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hmvq\" (UniqueName: \"kubernetes.io/projected/e149f20c-2288-4fcf-b90a-f2bb4029436d-kube-api-access-9hmvq\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " 
pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.255582 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e149f20c-2288-4fcf-b90a-f2bb4029436d-webhook-cert\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.255620 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e149f20c-2288-4fcf-b90a-f2bb4029436d-apiservice-cert\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.356213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hmvq\" (UniqueName: \"kubernetes.io/projected/e149f20c-2288-4fcf-b90a-f2bb4029436d-kube-api-access-9hmvq\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.356257 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e149f20c-2288-4fcf-b90a-f2bb4029436d-webhook-cert\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.356286 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e149f20c-2288-4fcf-b90a-f2bb4029436d-apiservice-cert\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.370943 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/e149f20c-2288-4fcf-b90a-f2bb4029436d-apiservice-cert\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.371050 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/e149f20c-2288-4fcf-b90a-f2bb4029436d-webhook-cert\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.373526 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hmvq\" (UniqueName: \"kubernetes.io/projected/e149f20c-2288-4fcf-b90a-f2bb4029436d-kube-api-access-9hmvq\") pod \"metallb-operator-controller-manager-8666f48b7d-c4vmb\" (UID: \"e149f20c-2288-4fcf-b90a-f2bb4029436d\") " pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.445307 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.466148 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt"] Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.466980 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.473753 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.473968 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-x5ld2" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.474132 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.494866 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt"] Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.662156 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae73146a-b079-4641-942b-9ceebbfbae34-apiservice-cert\") pod \"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.662485 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlbpl\" (UniqueName: \"kubernetes.io/projected/ae73146a-b079-4641-942b-9ceebbfbae34-kube-api-access-xlbpl\") pod 
\"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.662550 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae73146a-b079-4641-942b-9ceebbfbae34-webhook-cert\") pod \"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.763641 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae73146a-b079-4641-942b-9ceebbfbae34-webhook-cert\") pod \"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.763727 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae73146a-b079-4641-942b-9ceebbfbae34-apiservice-cert\") pod \"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.763771 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xlbpl\" (UniqueName: \"kubernetes.io/projected/ae73146a-b079-4641-942b-9ceebbfbae34-kube-api-access-xlbpl\") pod \"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.768403 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ae73146a-b079-4641-942b-9ceebbfbae34-apiservice-cert\") pod \"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.768929 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ae73146a-b079-4641-942b-9ceebbfbae34-webhook-cert\") pod \"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.814352 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlbpl\" (UniqueName: \"kubernetes.io/projected/ae73146a-b079-4641-942b-9ceebbfbae34-kube-api-access-xlbpl\") pod \"metallb-operator-webhook-server-5bbcb6bcff-p6plt\" (UID: \"ae73146a-b079-4641-942b-9ceebbfbae34\") " pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:27 crc kubenswrapper[4831]: I1203 06:45:27.837948 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb"] Dec 03 06:45:27 crc kubenswrapper[4831]: W1203 06:45:27.842974 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode149f20c_2288_4fcf_b90a_f2bb4029436d.slice/crio-165be8b22b30494ffbf3e1e4e106d234406402536ed06d0321334a37f853157e WatchSource:0}: Error finding container 165be8b22b30494ffbf3e1e4e106d234406402536ed06d0321334a37f853157e: Status 404 returned error can't find the container with id 165be8b22b30494ffbf3e1e4e106d234406402536ed06d0321334a37f853157e Dec 03 06:45:28 crc kubenswrapper[4831]: I1203 
06:45:28.025464 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" event={"ID":"e149f20c-2288-4fcf-b90a-f2bb4029436d","Type":"ContainerStarted","Data":"165be8b22b30494ffbf3e1e4e106d234406402536ed06d0321334a37f853157e"} Dec 03 06:45:28 crc kubenswrapper[4831]: I1203 06:45:28.109341 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:28 crc kubenswrapper[4831]: I1203 06:45:28.325251 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt"] Dec 03 06:45:28 crc kubenswrapper[4831]: W1203 06:45:28.332409 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae73146a_b079_4641_942b_9ceebbfbae34.slice/crio-4d4d5b24d26140d515410d3084977a1fcdd63bd95b4b7d06ae2e61cc0897b9aa WatchSource:0}: Error finding container 4d4d5b24d26140d515410d3084977a1fcdd63bd95b4b7d06ae2e61cc0897b9aa: Status 404 returned error can't find the container with id 4d4d5b24d26140d515410d3084977a1fcdd63bd95b4b7d06ae2e61cc0897b9aa Dec 03 06:45:29 crc kubenswrapper[4831]: I1203 06:45:29.033471 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" event={"ID":"ae73146a-b079-4641-942b-9ceebbfbae34","Type":"ContainerStarted","Data":"4d4d5b24d26140d515410d3084977a1fcdd63bd95b4b7d06ae2e61cc0897b9aa"} Dec 03 06:45:33 crc kubenswrapper[4831]: I1203 06:45:33.073098 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" event={"ID":"ae73146a-b079-4641-942b-9ceebbfbae34","Type":"ContainerStarted","Data":"1658c5ecbacae3e0efd63701d20cf8833ad7a4a9bd26ab352feb915d0b1009d3"} Dec 03 06:45:33 crc kubenswrapper[4831]: I1203 06:45:33.073857 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:33 crc kubenswrapper[4831]: I1203 06:45:33.074957 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" event={"ID":"e149f20c-2288-4fcf-b90a-f2bb4029436d","Type":"ContainerStarted","Data":"e62c11480a544c333915fdce5873f25bd444ebda7dae26231ac1c0a3ebe59795"} Dec 03 06:45:33 crc kubenswrapper[4831]: I1203 06:45:33.075846 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:45:33 crc kubenswrapper[4831]: I1203 06:45:33.114271 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" podStartSLOduration=1.778746606 podStartE2EDuration="6.114244537s" podCreationTimestamp="2025-12-03 06:45:27 +0000 UTC" firstStartedPulling="2025-12-03 06:45:28.334212485 +0000 UTC m=+865.677795983" lastFinishedPulling="2025-12-03 06:45:32.669710406 +0000 UTC m=+870.013293914" observedRunningTime="2025-12-03 06:45:33.109119834 +0000 UTC m=+870.452703382" watchObservedRunningTime="2025-12-03 06:45:33.114244537 +0000 UTC m=+870.457828085" Dec 03 06:45:33 crc kubenswrapper[4831]: I1203 06:45:33.152097 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" podStartSLOduration=1.41104578 podStartE2EDuration="6.15207977s" podCreationTimestamp="2025-12-03 06:45:27 +0000 UTC" firstStartedPulling="2025-12-03 06:45:27.844726506 +0000 UTC m=+865.188310014" lastFinishedPulling="2025-12-03 06:45:32.585760486 +0000 UTC m=+869.929344004" observedRunningTime="2025-12-03 06:45:33.147668191 +0000 UTC m=+870.491251699" watchObservedRunningTime="2025-12-03 06:45:33.15207977 +0000 UTC m=+870.495663278" Dec 03 06:45:48 crc 
kubenswrapper[4831]: I1203 06:45:48.116956 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5bbcb6bcff-p6plt" Dec 03 06:45:57 crc kubenswrapper[4831]: I1203 06:45:57.596422 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:45:57 crc kubenswrapper[4831]: I1203 06:45:57.597011 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:46:02 crc kubenswrapper[4831]: I1203 06:46:02.887165 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d6q7x"] Dec 03 06:46:02 crc kubenswrapper[4831]: I1203 06:46:02.888602 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:02 crc kubenswrapper[4831]: I1203 06:46:02.900277 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6q7x"] Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.013439 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-catalog-content\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.013722 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-utilities\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.013779 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzb22\" (UniqueName: \"kubernetes.io/projected/de478ce6-c46c-427a-ba46-f8b79546bf75-kube-api-access-xzb22\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.115532 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzb22\" (UniqueName: \"kubernetes.io/projected/de478ce6-c46c-427a-ba46-f8b79546bf75-kube-api-access-xzb22\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.115581 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-utilities\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.115613 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-catalog-content\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.116045 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-catalog-content\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.116160 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-utilities\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.154295 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzb22\" (UniqueName: \"kubernetes.io/projected/de478ce6-c46c-427a-ba46-f8b79546bf75-kube-api-access-xzb22\") pod \"redhat-marketplace-d6q7x\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.219744 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:03 crc kubenswrapper[4831]: I1203 06:46:03.567020 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6q7x"] Dec 03 06:46:04 crc kubenswrapper[4831]: I1203 06:46:04.255901 4831 generic.go:334] "Generic (PLEG): container finished" podID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerID="dcf993f83590b6c33506cde0a61e0f4cb5d69a8e2dae7b609ea2d39bcef2fd37" exitCode=0 Dec 03 06:46:04 crc kubenswrapper[4831]: I1203 06:46:04.255957 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6q7x" event={"ID":"de478ce6-c46c-427a-ba46-f8b79546bf75","Type":"ContainerDied","Data":"dcf993f83590b6c33506cde0a61e0f4cb5d69a8e2dae7b609ea2d39bcef2fd37"} Dec 03 06:46:04 crc kubenswrapper[4831]: I1203 06:46:04.256277 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6q7x" event={"ID":"de478ce6-c46c-427a-ba46-f8b79546bf75","Type":"ContainerStarted","Data":"c77cbc45ccfac31a5d817127db33ac0911976b5fe272e39b0cdd62351565894e"} Dec 03 06:46:05 crc kubenswrapper[4831]: I1203 06:46:05.268166 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6q7x" event={"ID":"de478ce6-c46c-427a-ba46-f8b79546bf75","Type":"ContainerStarted","Data":"6049db91152997fab8da5c4d261d62ba474e09baa8b1559548fd6a83ae1e23e9"} Dec 03 06:46:06 crc kubenswrapper[4831]: I1203 06:46:06.275509 4831 generic.go:334] "Generic (PLEG): container finished" podID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerID="6049db91152997fab8da5c4d261d62ba474e09baa8b1559548fd6a83ae1e23e9" exitCode=0 Dec 03 06:46:06 crc kubenswrapper[4831]: I1203 06:46:06.275556 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6q7x" 
event={"ID":"de478ce6-c46c-427a-ba46-f8b79546bf75","Type":"ContainerDied","Data":"6049db91152997fab8da5c4d261d62ba474e09baa8b1559548fd6a83ae1e23e9"} Dec 03 06:46:07 crc kubenswrapper[4831]: I1203 06:46:07.449033 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-8666f48b7d-c4vmb" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.136235 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-49hk4"] Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.138270 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.146727 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-mxdpp" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.146998 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.147113 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.167969 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8"] Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.168613 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.173397 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.178794 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.178860 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-conf\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.178887 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-sockets\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.178908 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-897tr\" (UniqueName: \"kubernetes.io/projected/b023523a-cf93-48a2-be02-a6f4ba831bca-kube-api-access-897tr\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.178927 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-reloader\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.178952 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-startup\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.178973 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics-certs\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.187996 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8"] Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.218925 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-flnpv"] Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.219968 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.221969 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.222020 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.223096 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-k4bww" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.223701 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.245637 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-f8648f98b-vwlgn"] Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.246424 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.247962 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.269012 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-vwlgn"] Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.279829 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ce7e180-7f81-4cb1-b046-7e53111c2731-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kvrm8\" (UID: \"8ce7e180-7f81-4cb1-b046-7e53111c2731\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.279883 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-startup\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.279913 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics-certs\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.279933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metrics-certs\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.279949 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metallb-excludel2\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.279970 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e184be1-0196-438c-a4ed-05ee32ccac09-metrics-certs\") pod \"controller-f8648f98b-vwlgn\" (UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.279986 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280003 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e184be1-0196-438c-a4ed-05ee32ccac09-cert\") pod \"controller-f8648f98b-vwlgn\" (UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280029 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rf97\" (UniqueName: \"kubernetes.io/projected/6e184be1-0196-438c-a4ed-05ee32ccac09-kube-api-access-8rf97\") pod \"controller-f8648f98b-vwlgn\" (UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280048 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr9k4\" (UniqueName: \"kubernetes.io/projected/8ce7e180-7f81-4cb1-b046-7e53111c2731-kube-api-access-rr9k4\") pod \"frr-k8s-webhook-server-7fcb986d4-kvrm8\" (UID: \"8ce7e180-7f81-4cb1-b046-7e53111c2731\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280082 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-conf\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280104 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-sockets\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280122 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-897tr\" (UniqueName: \"kubernetes.io/projected/b023523a-cf93-48a2-be02-a6f4ba831bca-kube-api-access-897tr\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280138 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280155 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-cxnff\" (UniqueName: \"kubernetes.io/projected/4ed12823-b3b1-4ee5-af2e-07320e5421eb-kube-api-access-cxnff\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280174 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-reloader\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280585 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-reloader\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: E1203 06:46:08.280700 4831 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Dec 03 06:46:08 crc kubenswrapper[4831]: E1203 06:46:08.280757 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics-certs podName:b023523a-cf93-48a2-be02-a6f4ba831bca nodeName:}" failed. No retries permitted until 2025-12-03 06:46:08.780736917 +0000 UTC m=+906.124320425 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics-certs") pod "frr-k8s-49hk4" (UID: "b023523a-cf93-48a2-be02-a6f4ba831bca") : secret "frr-k8s-certs-secret" not found Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.280809 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-startup\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.281046 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.281132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-conf\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.281268 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b023523a-cf93-48a2-be02-a6f4ba831bca-frr-sockets\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.289896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6q7x" event={"ID":"de478ce6-c46c-427a-ba46-f8b79546bf75","Type":"ContainerStarted","Data":"1300490ac595cf8176a3f2cc503b9764e9f52082c6816aceb5dd686794d49de4"} Dec 03 06:46:08 crc 
kubenswrapper[4831]: I1203 06:46:08.317176 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-897tr\" (UniqueName: \"kubernetes.io/projected/b023523a-cf93-48a2-be02-a6f4ba831bca-kube-api-access-897tr\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.321805 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d6q7x" podStartSLOduration=3.318854158 podStartE2EDuration="6.321788923s" podCreationTimestamp="2025-12-03 06:46:02 +0000 UTC" firstStartedPulling="2025-12-03 06:46:04.258769282 +0000 UTC m=+901.602352830" lastFinishedPulling="2025-12-03 06:46:07.261704047 +0000 UTC m=+904.605287595" observedRunningTime="2025-12-03 06:46:08.320484062 +0000 UTC m=+905.664067560" watchObservedRunningTime="2025-12-03 06:46:08.321788923 +0000 UTC m=+905.665372431" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381489 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metrics-certs\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381541 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metallb-excludel2\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381565 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e184be1-0196-438c-a4ed-05ee32ccac09-metrics-certs\") pod \"controller-f8648f98b-vwlgn\" 
(UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e184be1-0196-438c-a4ed-05ee32ccac09-cert\") pod \"controller-f8648f98b-vwlgn\" (UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381613 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rf97\" (UniqueName: \"kubernetes.io/projected/6e184be1-0196-438c-a4ed-05ee32ccac09-kube-api-access-8rf97\") pod \"controller-f8648f98b-vwlgn\" (UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381632 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rr9k4\" (UniqueName: \"kubernetes.io/projected/8ce7e180-7f81-4cb1-b046-7e53111c2731-kube-api-access-rr9k4\") pod \"frr-k8s-webhook-server-7fcb986d4-kvrm8\" (UID: \"8ce7e180-7f81-4cb1-b046-7e53111c2731\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381691 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: E1203 06:46:08.381695 4831 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Dec 03 06:46:08 crc kubenswrapper[4831]: E1203 06:46:08.381770 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metrics-certs podName:4ed12823-b3b1-4ee5-af2e-07320e5421eb nodeName:}" failed. No retries permitted until 2025-12-03 06:46:08.881751545 +0000 UTC m=+906.225335053 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metrics-certs") pod "speaker-flnpv" (UID: "4ed12823-b3b1-4ee5-af2e-07320e5421eb") : secret "speaker-certs-secret" not found Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381709 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cxnff\" (UniqueName: \"kubernetes.io/projected/4ed12823-b3b1-4ee5-af2e-07320e5421eb-kube-api-access-cxnff\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.381881 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ce7e180-7f81-4cb1-b046-7e53111c2731-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kvrm8\" (UID: \"8ce7e180-7f81-4cb1-b046-7e53111c2731\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:08 crc kubenswrapper[4831]: E1203 06:46:08.382231 4831 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.382238 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metallb-excludel2\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: E1203 06:46:08.382266 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist podName:4ed12823-b3b1-4ee5-af2e-07320e5421eb nodeName:}" failed. No retries permitted until 2025-12-03 06:46:08.882257731 +0000 UTC m=+906.225841239 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist") pod "speaker-flnpv" (UID: "4ed12823-b3b1-4ee5-af2e-07320e5421eb") : secret "metallb-memberlist" not found Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.384720 4831 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.386891 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8ce7e180-7f81-4cb1-b046-7e53111c2731-cert\") pod \"frr-k8s-webhook-server-7fcb986d4-kvrm8\" (UID: \"8ce7e180-7f81-4cb1-b046-7e53111c2731\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.398780 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxnff\" (UniqueName: \"kubernetes.io/projected/4ed12823-b3b1-4ee5-af2e-07320e5421eb-kube-api-access-cxnff\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.398792 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6e184be1-0196-438c-a4ed-05ee32ccac09-cert\") pod \"controller-f8648f98b-vwlgn\" (UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.402900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rf97\" (UniqueName: 
\"kubernetes.io/projected/6e184be1-0196-438c-a4ed-05ee32ccac09-kube-api-access-8rf97\") pod \"controller-f8648f98b-vwlgn\" (UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.402936 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr9k4\" (UniqueName: \"kubernetes.io/projected/8ce7e180-7f81-4cb1-b046-7e53111c2731-kube-api-access-rr9k4\") pod \"frr-k8s-webhook-server-7fcb986d4-kvrm8\" (UID: \"8ce7e180-7f81-4cb1-b046-7e53111c2731\") " pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.405872 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6e184be1-0196-438c-a4ed-05ee32ccac09-metrics-certs\") pod \"controller-f8648f98b-vwlgn\" (UID: \"6e184be1-0196-438c-a4ed-05ee32ccac09\") " pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.493251 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.561130 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.714218 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8"] Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.792477 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics-certs\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.796468 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b023523a-cf93-48a2-be02-a6f4ba831bca-metrics-certs\") pod \"frr-k8s-49hk4\" (UID: \"b023523a-cf93-48a2-be02-a6f4ba831bca\") " pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.804300 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-f8648f98b-vwlgn"] Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.893637 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.893936 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metrics-certs\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:08 crc kubenswrapper[4831]: E1203 06:46:08.893822 4831 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret 
"metallb-memberlist" not found Dec 03 06:46:08 crc kubenswrapper[4831]: E1203 06:46:08.894036 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist podName:4ed12823-b3b1-4ee5-af2e-07320e5421eb nodeName:}" failed. No retries permitted until 2025-12-03 06:46:09.894017252 +0000 UTC m=+907.237600760 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist") pod "speaker-flnpv" (UID: "4ed12823-b3b1-4ee5-af2e-07320e5421eb") : secret "metallb-memberlist" not found Dec 03 06:46:08 crc kubenswrapper[4831]: I1203 06:46:08.898665 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-metrics-certs\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.076254 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.295719 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerStarted","Data":"b767d138431e886a78698faa511ccd8198421ce4d4eeb2bf1f4c7228aeb5c56a"} Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.296817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" event={"ID":"8ce7e180-7f81-4cb1-b046-7e53111c2731","Type":"ContainerStarted","Data":"24cf3ba2c3d6940f4ee76d508ccef66012da80ed9afe0d3efb0b247e4b86fe75"} Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.298378 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vwlgn" event={"ID":"6e184be1-0196-438c-a4ed-05ee32ccac09","Type":"ContainerStarted","Data":"60f18f8e03b719d45664b118f4a95e8a40780b624b8057891e4076c84cff464a"} Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.298424 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vwlgn" event={"ID":"6e184be1-0196-438c-a4ed-05ee32ccac09","Type":"ContainerStarted","Data":"2bd4daac7c181188a14f2d1b7e4570cb560c2a4110ba65b7d268582a82a3dad8"} Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.298434 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-f8648f98b-vwlgn" event={"ID":"6e184be1-0196-438c-a4ed-05ee32ccac09","Type":"ContainerStarted","Data":"484d5fc04e174eefbd5b05c1dc6cb1806ffeed74be3e233863b7fe4eb3a9cc62"} Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.298536 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.906443 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" 
(UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:09 crc kubenswrapper[4831]: I1203 06:46:09.914040 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/4ed12823-b3b1-4ee5-af2e-07320e5421eb-memberlist\") pod \"speaker-flnpv\" (UID: \"4ed12823-b3b1-4ee5-af2e-07320e5421eb\") " pod="metallb-system/speaker-flnpv" Dec 03 06:46:10 crc kubenswrapper[4831]: I1203 06:46:10.034642 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-flnpv" Dec 03 06:46:10 crc kubenswrapper[4831]: I1203 06:46:10.305960 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-flnpv" event={"ID":"4ed12823-b3b1-4ee5-af2e-07320e5421eb","Type":"ContainerStarted","Data":"3c9a551f286493a2dbf96bf73dd292bf181cd25893c9464bfb0115a5cc7ced55"} Dec 03 06:46:11 crc kubenswrapper[4831]: I1203 06:46:11.317523 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-flnpv" event={"ID":"4ed12823-b3b1-4ee5-af2e-07320e5421eb","Type":"ContainerStarted","Data":"28739794322a51f097aa185ade7c72bac632d7242e0d045e83446b57c6b18c96"} Dec 03 06:46:11 crc kubenswrapper[4831]: I1203 06:46:11.317820 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-flnpv" event={"ID":"4ed12823-b3b1-4ee5-af2e-07320e5421eb","Type":"ContainerStarted","Data":"d21121427bfdab631c11f75691bbb507f2bd8a60a0ca082054c32afe5ca37c85"} Dec 03 06:46:11 crc kubenswrapper[4831]: I1203 06:46:11.318471 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-flnpv" Dec 03 06:46:11 crc kubenswrapper[4831]: I1203 06:46:11.355895 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-flnpv" podStartSLOduration=3.35587482 
podStartE2EDuration="3.35587482s" podCreationTimestamp="2025-12-03 06:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:11.345535313 +0000 UTC m=+908.689118841" watchObservedRunningTime="2025-12-03 06:46:11.35587482 +0000 UTC m=+908.699458348" Dec 03 06:46:11 crc kubenswrapper[4831]: I1203 06:46:11.356746 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-f8648f98b-vwlgn" podStartSLOduration=3.356717766 podStartE2EDuration="3.356717766s" podCreationTimestamp="2025-12-03 06:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:46:09.314973198 +0000 UTC m=+906.658556706" watchObservedRunningTime="2025-12-03 06:46:11.356717766 +0000 UTC m=+908.700301274" Dec 03 06:46:13 crc kubenswrapper[4831]: I1203 06:46:13.220573 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:13 crc kubenswrapper[4831]: I1203 06:46:13.220628 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:13 crc kubenswrapper[4831]: I1203 06:46:13.279539 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:13 crc kubenswrapper[4831]: I1203 06:46:13.384105 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:13 crc kubenswrapper[4831]: I1203 06:46:13.508405 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6q7x"] Dec 03 06:46:15 crc kubenswrapper[4831]: I1203 06:46:15.361302 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-d6q7x" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerName="registry-server" containerID="cri-o://1300490ac595cf8176a3f2cc503b9764e9f52082c6816aceb5dd686794d49de4" gracePeriod=2 Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.235176 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mkpjw"] Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.236306 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.252280 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mkpjw"] Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.365282 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-utilities\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.365359 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqzpl\" (UniqueName: \"kubernetes.io/projected/ef034df4-d247-47c4-938c-8bb61989ca5f-kube-api-access-hqzpl\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.365402 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-catalog-content\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " 
pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.368585 4831 generic.go:334] "Generic (PLEG): container finished" podID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerID="1300490ac595cf8176a3f2cc503b9764e9f52082c6816aceb5dd686794d49de4" exitCode=0 Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.368622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6q7x" event={"ID":"de478ce6-c46c-427a-ba46-f8b79546bf75","Type":"ContainerDied","Data":"1300490ac595cf8176a3f2cc503b9764e9f52082c6816aceb5dd686794d49de4"} Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.466944 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-utilities\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.466997 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqzpl\" (UniqueName: \"kubernetes.io/projected/ef034df4-d247-47c4-938c-8bb61989ca5f-kube-api-access-hqzpl\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.467027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-catalog-content\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.467496 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-utilities\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.467572 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-catalog-content\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.484615 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqzpl\" (UniqueName: \"kubernetes.io/projected/ef034df4-d247-47c4-938c-8bb61989ca5f-kube-api-access-hqzpl\") pod \"community-operators-mkpjw\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:16 crc kubenswrapper[4831]: I1203 06:46:16.557498 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.732716 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.841574 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-utilities\") pod \"de478ce6-c46c-427a-ba46-f8b79546bf75\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.842086 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-catalog-content\") pod \"de478ce6-c46c-427a-ba46-f8b79546bf75\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.842190 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzb22\" (UniqueName: \"kubernetes.io/projected/de478ce6-c46c-427a-ba46-f8b79546bf75-kube-api-access-xzb22\") pod \"de478ce6-c46c-427a-ba46-f8b79546bf75\" (UID: \"de478ce6-c46c-427a-ba46-f8b79546bf75\") " Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.842748 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-utilities" (OuterVolumeSpecName: "utilities") pod "de478ce6-c46c-427a-ba46-f8b79546bf75" (UID: "de478ce6-c46c-427a-ba46-f8b79546bf75"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.848062 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de478ce6-c46c-427a-ba46-f8b79546bf75-kube-api-access-xzb22" (OuterVolumeSpecName: "kube-api-access-xzb22") pod "de478ce6-c46c-427a-ba46-f8b79546bf75" (UID: "de478ce6-c46c-427a-ba46-f8b79546bf75"). InnerVolumeSpecName "kube-api-access-xzb22". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.858304 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "de478ce6-c46c-427a-ba46-f8b79546bf75" (UID: "de478ce6-c46c-427a-ba46-f8b79546bf75"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.892640 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mkpjw"] Dec 03 06:46:17 crc kubenswrapper[4831]: W1203 06:46:17.896013 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef034df4_d247_47c4_938c_8bb61989ca5f.slice/crio-eae657de97c4ab9bf292342e999738791f3265dde4a3011ffa2cb6927e1bc4af WatchSource:0}: Error finding container eae657de97c4ab9bf292342e999738791f3265dde4a3011ffa2cb6927e1bc4af: Status 404 returned error can't find the container with id eae657de97c4ab9bf292342e999738791f3265dde4a3011ffa2cb6927e1bc4af Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.944056 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.944109 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzb22\" (UniqueName: \"kubernetes.io/projected/de478ce6-c46c-427a-ba46-f8b79546bf75-kube-api-access-xzb22\") on node \"crc\" DevicePath \"\"" Dec 03 06:46:17 crc kubenswrapper[4831]: I1203 06:46:17.944141 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/de478ce6-c46c-427a-ba46-f8b79546bf75-utilities\") on node 
\"crc\" DevicePath \"\"" Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.399620 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" event={"ID":"8ce7e180-7f81-4cb1-b046-7e53111c2731","Type":"ContainerStarted","Data":"1257bf81a7ba6f61dd676e17c69792db50b1a1d601456bd036ff13ebaddecc65"} Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.399736 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.403102 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d6q7x" event={"ID":"de478ce6-c46c-427a-ba46-f8b79546bf75","Type":"ContainerDied","Data":"c77cbc45ccfac31a5d817127db33ac0911976b5fe272e39b0cdd62351565894e"} Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.403129 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d6q7x" Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.403167 4831 scope.go:117] "RemoveContainer" containerID="1300490ac595cf8176a3f2cc503b9764e9f52082c6816aceb5dd686794d49de4" Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.406438 4831 generic.go:334] "Generic (PLEG): container finished" podID="b023523a-cf93-48a2-be02-a6f4ba831bca" containerID="a6f0200643a7e60d52247818d2fb6e222b8ac210b92608654a2ba455ddb4dab2" exitCode=0 Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.406584 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerDied","Data":"a6f0200643a7e60d52247818d2fb6e222b8ac210b92608654a2ba455ddb4dab2"} Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.411516 4831 generic.go:334] "Generic (PLEG): container finished" podID="ef034df4-d247-47c4-938c-8bb61989ca5f" 
containerID="20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982" exitCode=0 Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.411572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mkpjw" event={"ID":"ef034df4-d247-47c4-938c-8bb61989ca5f","Type":"ContainerDied","Data":"20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982"} Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.411604 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mkpjw" event={"ID":"ef034df4-d247-47c4-938c-8bb61989ca5f","Type":"ContainerStarted","Data":"eae657de97c4ab9bf292342e999738791f3265dde4a3011ffa2cb6927e1bc4af"} Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.422451 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" podStartSLOduration=1.629393272 podStartE2EDuration="10.422430877s" podCreationTimestamp="2025-12-03 06:46:08 +0000 UTC" firstStartedPulling="2025-12-03 06:46:08.719795584 +0000 UTC m=+906.063379092" lastFinishedPulling="2025-12-03 06:46:17.512833179 +0000 UTC m=+914.856416697" observedRunningTime="2025-12-03 06:46:18.418001236 +0000 UTC m=+915.761584754" watchObservedRunningTime="2025-12-03 06:46:18.422430877 +0000 UTC m=+915.766014385" Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.428119 4831 scope.go:117] "RemoveContainer" containerID="6049db91152997fab8da5c4d261d62ba474e09baa8b1559548fd6a83ae1e23e9" Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.469128 4831 scope.go:117] "RemoveContainer" containerID="dcf993f83590b6c33506cde0a61e0f4cb5d69a8e2dae7b609ea2d39bcef2fd37" Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.491057 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6q7x"] Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.493736 4831 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d6q7x"] Dec 03 06:46:18 crc kubenswrapper[4831]: I1203 06:46:18.565651 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-f8648f98b-vwlgn" Dec 03 06:46:19 crc kubenswrapper[4831]: I1203 06:46:19.031856 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" path="/var/lib/kubelet/pods/de478ce6-c46c-427a-ba46-f8b79546bf75/volumes" Dec 03 06:46:19 crc kubenswrapper[4831]: I1203 06:46:19.420562 4831 generic.go:334] "Generic (PLEG): container finished" podID="b023523a-cf93-48a2-be02-a6f4ba831bca" containerID="b007f08fba9caf9e6d97b48a8522cb2b591d6b751f1b5dfba276bce8a61bb0d4" exitCode=0 Dec 03 06:46:19 crc kubenswrapper[4831]: I1203 06:46:19.420638 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerDied","Data":"b007f08fba9caf9e6d97b48a8522cb2b591d6b751f1b5dfba276bce8a61bb0d4"} Dec 03 06:46:19 crc kubenswrapper[4831]: I1203 06:46:19.423364 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mkpjw" event={"ID":"ef034df4-d247-47c4-938c-8bb61989ca5f","Type":"ContainerStarted","Data":"b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2"} Dec 03 06:46:20 crc kubenswrapper[4831]: I1203 06:46:20.039972 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-flnpv" Dec 03 06:46:20 crc kubenswrapper[4831]: I1203 06:46:20.433715 4831 generic.go:334] "Generic (PLEG): container finished" podID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerID="b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2" exitCode=0 Dec 03 06:46:20 crc kubenswrapper[4831]: I1203 06:46:20.433761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mkpjw" 
event={"ID":"ef034df4-d247-47c4-938c-8bb61989ca5f","Type":"ContainerDied","Data":"b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2"} Dec 03 06:46:20 crc kubenswrapper[4831]: I1203 06:46:20.438112 4831 generic.go:334] "Generic (PLEG): container finished" podID="b023523a-cf93-48a2-be02-a6f4ba831bca" containerID="b30e7724fac1de566bf6a4f780c8abd5c73bb35cf8a88ae17dc83d0ba717895c" exitCode=0 Dec 03 06:46:20 crc kubenswrapper[4831]: I1203 06:46:20.438172 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerDied","Data":"b30e7724fac1de566bf6a4f780c8abd5c73bb35cf8a88ae17dc83d0ba717895c"} Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.448243 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerStarted","Data":"2d1fbe395f2d85fa58e0676c4b4e4f8ba7833f3742a41986d70a93df2d452dc4"} Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.449745 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerStarted","Data":"9f3385b7f7fe29c6249d3b5a36bd882051c6e63874c4ca3fd7288ba29c8cb2c8"} Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.449873 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerStarted","Data":"351d187fedebbe2135c3993730628a41589aa6e84d6d17cc4b691c1ffeca4c87"} Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.450014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerStarted","Data":"d3bfc049a7a81c4974fee3c465984bcd9e214886bbb8b0dadad4a5a10e7560c7"} Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.450103 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerStarted","Data":"796740ea9d5e4d752366719f16bf3a170fb174ee22fffa6a95715fe9a2370273"} Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.450548 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mkpjw" event={"ID":"ef034df4-d247-47c4-938c-8bb61989ca5f","Type":"ContainerStarted","Data":"25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207"} Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.467047 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mkpjw" podStartSLOduration=2.749334233 podStartE2EDuration="5.467032946s" podCreationTimestamp="2025-12-03 06:46:16 +0000 UTC" firstStartedPulling="2025-12-03 06:46:18.428474047 +0000 UTC m=+915.772057575" lastFinishedPulling="2025-12-03 06:46:21.14617278 +0000 UTC m=+918.489756288" observedRunningTime="2025-12-03 06:46:21.465911561 +0000 UTC m=+918.809495069" watchObservedRunningTime="2025-12-03 06:46:21.467032946 +0000 UTC m=+918.810616454" Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.959628 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd"] Dec 03 06:46:21 crc kubenswrapper[4831]: E1203 06:46:21.960109 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerName="extract-content" Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.960198 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerName="extract-content" Dec 03 06:46:21 crc kubenswrapper[4831]: E1203 06:46:21.960273 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" 
containerName="registry-server" Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.960388 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerName="registry-server" Dec 03 06:46:21 crc kubenswrapper[4831]: E1203 06:46:21.960479 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerName="extract-utilities" Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.960549 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerName="extract-utilities" Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.960828 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="de478ce6-c46c-427a-ba46-f8b79546bf75" containerName="registry-server" Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.961869 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.975100 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 06:46:21 crc kubenswrapper[4831]: I1203 06:46:21.982027 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd"] Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.028257 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.028358 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js4bc\" (UniqueName: \"kubernetes.io/projected/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-kube-api-access-js4bc\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.028425 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.130371 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.130497 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.130533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js4bc\" (UniqueName: 
\"kubernetes.io/projected/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-kube-api-access-js4bc\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.131090 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.131201 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.151363 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js4bc\" (UniqueName: \"kubernetes.io/projected/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-kube-api-access-js4bc\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.286204 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.473310 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-49hk4" event={"ID":"b023523a-cf93-48a2-be02-a6f4ba831bca","Type":"ContainerStarted","Data":"1d5168fa5837892637d84d2404246126bfb67c9d4f5e3d8af2502ad92ad53da6"} Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.496171 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-49hk4" podStartSLOduration=6.128804416 podStartE2EDuration="14.496157836s" podCreationTimestamp="2025-12-03 06:46:08 +0000 UTC" firstStartedPulling="2025-12-03 06:46:09.176671173 +0000 UTC m=+906.520254721" lastFinishedPulling="2025-12-03 06:46:17.544024633 +0000 UTC m=+914.887608141" observedRunningTime="2025-12-03 06:46:22.492362077 +0000 UTC m=+919.835945585" watchObservedRunningTime="2025-12-03 06:46:22.496157836 +0000 UTC m=+919.839741334" Dec 03 06:46:22 crc kubenswrapper[4831]: W1203 06:46:22.508061 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ea5269_a0d1_4074_a27b_ff29b0dd0ec4.slice/crio-bbaa9936dfa4702c9eb2d178b9640dbdf9cb8947ceaa9390d7099fe6566fcf36 WatchSource:0}: Error finding container bbaa9936dfa4702c9eb2d178b9640dbdf9cb8947ceaa9390d7099fe6566fcf36: Status 404 returned error can't find the container with id bbaa9936dfa4702c9eb2d178b9640dbdf9cb8947ceaa9390d7099fe6566fcf36 Dec 03 06:46:22 crc kubenswrapper[4831]: I1203 06:46:22.512247 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd"] Dec 03 06:46:23 crc kubenswrapper[4831]: I1203 06:46:23.481872 4831 generic.go:334] "Generic (PLEG): container finished" podID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" 
containerID="5316002461fbaeda97e69ade1ea1a5583c815f2b63573725bc56497792882993" exitCode=0 Dec 03 06:46:23 crc kubenswrapper[4831]: I1203 06:46:23.481955 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" event={"ID":"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4","Type":"ContainerDied","Data":"5316002461fbaeda97e69ade1ea1a5583c815f2b63573725bc56497792882993"} Dec 03 06:46:23 crc kubenswrapper[4831]: I1203 06:46:23.482361 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" event={"ID":"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4","Type":"ContainerStarted","Data":"bbaa9936dfa4702c9eb2d178b9640dbdf9cb8947ceaa9390d7099fe6566fcf36"} Dec 03 06:46:23 crc kubenswrapper[4831]: I1203 06:46:23.483114 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:24 crc kubenswrapper[4831]: I1203 06:46:24.076898 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:24 crc kubenswrapper[4831]: I1203 06:46:24.113692 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:26 crc kubenswrapper[4831]: I1203 06:46:26.557633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:26 crc kubenswrapper[4831]: I1203 06:46:26.557964 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:26 crc kubenswrapper[4831]: I1203 06:46:26.632223 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:27 crc kubenswrapper[4831]: I1203 06:46:27.511556 4831 generic.go:334] "Generic (PLEG): 
container finished" podID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerID="0e626de034f245ab2cffa3a08b8dec1ce649370272d94ea3cced71d0d8ad32ae" exitCode=0 Dec 03 06:46:27 crc kubenswrapper[4831]: I1203 06:46:27.511652 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" event={"ID":"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4","Type":"ContainerDied","Data":"0e626de034f245ab2cffa3a08b8dec1ce649370272d94ea3cced71d0d8ad32ae"} Dec 03 06:46:27 crc kubenswrapper[4831]: I1203 06:46:27.569188 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:27 crc kubenswrapper[4831]: I1203 06:46:27.597069 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:46:27 crc kubenswrapper[4831]: I1203 06:46:27.597145 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:46:28 crc kubenswrapper[4831]: I1203 06:46:28.503121 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7fcb986d4-kvrm8" Dec 03 06:46:28 crc kubenswrapper[4831]: I1203 06:46:28.525425 4831 generic.go:334] "Generic (PLEG): container finished" podID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerID="39fe71e1642f6dd1ca2d9bdf39f62ca210005b8fcd9850a3926f4805bb211e03" exitCode=0 Dec 03 06:46:28 crc kubenswrapper[4831]: I1203 06:46:28.525555 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" event={"ID":"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4","Type":"ContainerDied","Data":"39fe71e1642f6dd1ca2d9bdf39f62ca210005b8fcd9850a3926f4805bb211e03"} Dec 03 06:46:29 crc kubenswrapper[4831]: I1203 06:46:29.315453 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mkpjw"] Dec 03 06:46:29 crc kubenswrapper[4831]: I1203 06:46:29.532459 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-mkpjw" podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerName="registry-server" containerID="cri-o://25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207" gracePeriod=2 Dec 03 06:46:29 crc kubenswrapper[4831]: I1203 06:46:29.865451 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:29 crc kubenswrapper[4831]: I1203 06:46:29.992012 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js4bc\" (UniqueName: \"kubernetes.io/projected/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-kube-api-access-js4bc\") pod \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " Dec 03 06:46:29 crc kubenswrapper[4831]: I1203 06:46:29.992064 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-util\") pod \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " Dec 03 06:46:29 crc kubenswrapper[4831]: I1203 06:46:29.992111 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-bundle\") pod 
\"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\" (UID: \"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4\") " Dec 03 06:46:29 crc kubenswrapper[4831]: I1203 06:46:29.993281 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-bundle" (OuterVolumeSpecName: "bundle") pod "c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" (UID: "c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:46:29 crc kubenswrapper[4831]: I1203 06:46:29.998500 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-kube-api-access-js4bc" (OuterVolumeSpecName: "kube-api-access-js4bc") pod "c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" (UID: "c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4"). InnerVolumeSpecName "kube-api-access-js4bc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.002501 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-util" (OuterVolumeSpecName: "util") pod "c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" (UID: "c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.093635 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-util\") on node \"crc\" DevicePath \"\"" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.093693 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js4bc\" (UniqueName: \"kubernetes.io/projected/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-kube-api-access-js4bc\") on node \"crc\" DevicePath \"\"" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.093711 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.442436 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.541261 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" event={"ID":"c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4","Type":"ContainerDied","Data":"bbaa9936dfa4702c9eb2d178b9640dbdf9cb8947ceaa9390d7099fe6566fcf36"} Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.541300 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bbaa9936dfa4702c9eb2d178b9640dbdf9cb8947ceaa9390d7099fe6566fcf36" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.541373 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.568580 4831 generic.go:334] "Generic (PLEG): container finished" podID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerID="25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207" exitCode=0 Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.568629 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mkpjw" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.568629 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mkpjw" event={"ID":"ef034df4-d247-47c4-938c-8bb61989ca5f","Type":"ContainerDied","Data":"25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207"} Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.568767 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mkpjw" event={"ID":"ef034df4-d247-47c4-938c-8bb61989ca5f","Type":"ContainerDied","Data":"eae657de97c4ab9bf292342e999738791f3265dde4a3011ffa2cb6927e1bc4af"} Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.568788 4831 scope.go:117] "RemoveContainer" containerID="25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.590742 4831 scope.go:117] "RemoveContainer" containerID="b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.600966 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-utilities\") pod \"ef034df4-d247-47c4-938c-8bb61989ca5f\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.601065 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqzpl\" (UniqueName: \"kubernetes.io/projected/ef034df4-d247-47c4-938c-8bb61989ca5f-kube-api-access-hqzpl\") pod \"ef034df4-d247-47c4-938c-8bb61989ca5f\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.601127 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-catalog-content\") pod \"ef034df4-d247-47c4-938c-8bb61989ca5f\" (UID: \"ef034df4-d247-47c4-938c-8bb61989ca5f\") " Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.601852 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-utilities" (OuterVolumeSpecName: "utilities") pod "ef034df4-d247-47c4-938c-8bb61989ca5f" (UID: "ef034df4-d247-47c4-938c-8bb61989ca5f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.606398 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef034df4-d247-47c4-938c-8bb61989ca5f-kube-api-access-hqzpl" (OuterVolumeSpecName: "kube-api-access-hqzpl") pod "ef034df4-d247-47c4-938c-8bb61989ca5f" (UID: "ef034df4-d247-47c4-938c-8bb61989ca5f"). InnerVolumeSpecName "kube-api-access-hqzpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.609082 4831 scope.go:117] "RemoveContainer" containerID="20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.626831 4831 scope.go:117] "RemoveContainer" containerID="25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207" Dec 03 06:46:30 crc kubenswrapper[4831]: E1203 06:46:30.627261 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207\": container with ID starting with 25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207 not found: ID does not exist" containerID="25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.627325 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207"} err="failed to get container status \"25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207\": rpc error: code = NotFound desc = could not find container \"25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207\": container with ID starting with 25470add0f7ea1c6b2ca056551690fdc2efc2813bf6358037011c97ba430e207 not found: ID does not exist" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.627355 4831 scope.go:117] "RemoveContainer" containerID="b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2" Dec 03 06:46:30 crc kubenswrapper[4831]: E1203 06:46:30.627685 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2\": container with ID starting with 
b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2 not found: ID does not exist" containerID="b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.627721 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2"} err="failed to get container status \"b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2\": rpc error: code = NotFound desc = could not find container \"b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2\": container with ID starting with b933d67aa84c243e451b2fa830306c46b207507b055a4bccc25860174e8d00b2 not found: ID does not exist" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.627750 4831 scope.go:117] "RemoveContainer" containerID="20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982" Dec 03 06:46:30 crc kubenswrapper[4831]: E1203 06:46:30.628198 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982\": container with ID starting with 20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982 not found: ID does not exist" containerID="20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.628222 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982"} err="failed to get container status \"20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982\": rpc error: code = NotFound desc = could not find container \"20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982\": container with ID starting with 20bde613b31b9a99805ca3fb7988997d0b7613d15977728567fc01e333d9c982 not found: ID does not 
exist" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.655440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ef034df4-d247-47c4-938c-8bb61989ca5f" (UID: "ef034df4-d247-47c4-938c-8bb61989ca5f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.702859 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hqzpl\" (UniqueName: \"kubernetes.io/projected/ef034df4-d247-47c4-938c-8bb61989ca5f-kube-api-access-hqzpl\") on node \"crc\" DevicePath \"\"" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.702898 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.702914 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ef034df4-d247-47c4-938c-8bb61989ca5f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.900096 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mkpjw"] Dec 03 06:46:30 crc kubenswrapper[4831]: I1203 06:46:30.905112 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mkpjw"] Dec 03 06:46:31 crc kubenswrapper[4831]: I1203 06:46:31.022091 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" path="/var/lib/kubelet/pods/ef034df4-d247-47c4-938c-8bb61989ca5f/volumes" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.516273 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2"] Dec 03 06:46:35 crc kubenswrapper[4831]: E1203 06:46:35.517173 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerName="util" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517188 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerName="util" Dec 03 06:46:35 crc kubenswrapper[4831]: E1203 06:46:35.517207 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerName="extract" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517216 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerName="extract" Dec 03 06:46:35 crc kubenswrapper[4831]: E1203 06:46:35.517232 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerName="extract-utilities" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517240 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerName="extract-utilities" Dec 03 06:46:35 crc kubenswrapper[4831]: E1203 06:46:35.517256 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerName="registry-server" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517264 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerName="registry-server" Dec 03 06:46:35 crc kubenswrapper[4831]: E1203 06:46:35.517278 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerName="extract-content" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517286 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerName="extract-content" Dec 03 06:46:35 crc kubenswrapper[4831]: E1203 06:46:35.517301 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerName="pull" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517326 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerName="pull" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517447 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef034df4-d247-47c4-938c-8bb61989ca5f" containerName="registry-server" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517461 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4" containerName="extract" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.517921 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.519389 4831 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-wz8j6" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.521564 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.521756 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.540849 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2"] Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.667701 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abc80626-3722-43a0-8c09-c2a924946e16-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27nx2\" (UID: \"abc80626-3722-43a0-8c09-c2a924946e16\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.667813 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmkqr\" (UniqueName: \"kubernetes.io/projected/abc80626-3722-43a0-8c09-c2a924946e16-kube-api-access-mmkqr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27nx2\" (UID: \"abc80626-3722-43a0-8c09-c2a924946e16\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.769090 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abc80626-3722-43a0-8c09-c2a924946e16-tmp\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27nx2\" (UID: \"abc80626-3722-43a0-8c09-c2a924946e16\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.769166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmkqr\" (UniqueName: \"kubernetes.io/projected/abc80626-3722-43a0-8c09-c2a924946e16-kube-api-access-mmkqr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27nx2\" (UID: \"abc80626-3722-43a0-8c09-c2a924946e16\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.769567 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/abc80626-3722-43a0-8c09-c2a924946e16-tmp\") pod 
\"cert-manager-operator-controller-manager-64cf6dff88-27nx2\" (UID: \"abc80626-3722-43a0-8c09-c2a924946e16\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.787498 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmkqr\" (UniqueName: \"kubernetes.io/projected/abc80626-3722-43a0-8c09-c2a924946e16-kube-api-access-mmkqr\") pod \"cert-manager-operator-controller-manager-64cf6dff88-27nx2\" (UID: \"abc80626-3722-43a0-8c09-c2a924946e16\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" Dec 03 06:46:35 crc kubenswrapper[4831]: I1203 06:46:35.847915 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" Dec 03 06:46:36 crc kubenswrapper[4831]: I1203 06:46:36.394139 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2"] Dec 03 06:46:36 crc kubenswrapper[4831]: W1203 06:46:36.403478 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabc80626_3722_43a0_8c09_c2a924946e16.slice/crio-7536d5fb4810e8e28c3eb51aaeee46403eb97ecbb251f7380881791c370532a7 WatchSource:0}: Error finding container 7536d5fb4810e8e28c3eb51aaeee46403eb97ecbb251f7380881791c370532a7: Status 404 returned error can't find the container with id 7536d5fb4810e8e28c3eb51aaeee46403eb97ecbb251f7380881791c370532a7 Dec 03 06:46:36 crc kubenswrapper[4831]: I1203 06:46:36.605417 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" event={"ID":"abc80626-3722-43a0-8c09-c2a924946e16","Type":"ContainerStarted","Data":"7536d5fb4810e8e28c3eb51aaeee46403eb97ecbb251f7380881791c370532a7"} Dec 03 
06:46:39 crc kubenswrapper[4831]: I1203 06:46:39.082662 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-49hk4" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.127692 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dz8wk"] Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.130635 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.139160 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dz8wk"] Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.277239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-utilities\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.277301 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-catalog-content\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.277357 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg8zb\" (UniqueName: \"kubernetes.io/projected/13940a9c-6227-4e54-8683-376584d6f937-kube-api-access-xg8zb\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 
06:46:44.378770 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg8zb\" (UniqueName: \"kubernetes.io/projected/13940a9c-6227-4e54-8683-376584d6f937-kube-api-access-xg8zb\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.378859 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-utilities\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.378906 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-catalog-content\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.379466 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-catalog-content\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.379543 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-utilities\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.404921 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg8zb\" (UniqueName: \"kubernetes.io/projected/13940a9c-6227-4e54-8683-376584d6f937-kube-api-access-xg8zb\") pod \"certified-operators-dz8wk\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:44 crc kubenswrapper[4831]: I1203 06:46:44.486700 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:45 crc kubenswrapper[4831]: I1203 06:46:45.205882 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dz8wk"] Dec 03 06:46:45 crc kubenswrapper[4831]: W1203 06:46:45.213980 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13940a9c_6227_4e54_8683_376584d6f937.slice/crio-027244d763e696bf7fe47d010e088769892a0208e3c9a8dfbf66d5f79914acfa WatchSource:0}: Error finding container 027244d763e696bf7fe47d010e088769892a0208e3c9a8dfbf66d5f79914acfa: Status 404 returned error can't find the container with id 027244d763e696bf7fe47d010e088769892a0208e3c9a8dfbf66d5f79914acfa Dec 03 06:46:45 crc kubenswrapper[4831]: I1203 06:46:45.666853 4831 generic.go:334] "Generic (PLEG): container finished" podID="13940a9c-6227-4e54-8683-376584d6f937" containerID="6d263d83f6a7d21a93c170dbe49068036ccd1158cf8b137fbd2c254b23f40c4d" exitCode=0 Dec 03 06:46:45 crc kubenswrapper[4831]: I1203 06:46:45.666917 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz8wk" event={"ID":"13940a9c-6227-4e54-8683-376584d6f937","Type":"ContainerDied","Data":"6d263d83f6a7d21a93c170dbe49068036ccd1158cf8b137fbd2c254b23f40c4d"} Dec 03 06:46:45 crc kubenswrapper[4831]: I1203 06:46:45.667254 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz8wk" 
event={"ID":"13940a9c-6227-4e54-8683-376584d6f937","Type":"ContainerStarted","Data":"027244d763e696bf7fe47d010e088769892a0208e3c9a8dfbf66d5f79914acfa"} Dec 03 06:46:45 crc kubenswrapper[4831]: I1203 06:46:45.668664 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" event={"ID":"abc80626-3722-43a0-8c09-c2a924946e16","Type":"ContainerStarted","Data":"18cf3085c14412b771471ef067ccc3ae8da7513f8fdad28e84b270cfb6619494"} Dec 03 06:46:45 crc kubenswrapper[4831]: I1203 06:46:45.713520 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-64cf6dff88-27nx2" podStartSLOduration=2.211069641 podStartE2EDuration="10.713500325s" podCreationTimestamp="2025-12-03 06:46:35 +0000 UTC" firstStartedPulling="2025-12-03 06:46:36.406016033 +0000 UTC m=+933.749599541" lastFinishedPulling="2025-12-03 06:46:44.908446717 +0000 UTC m=+942.252030225" observedRunningTime="2025-12-03 06:46:45.712236986 +0000 UTC m=+943.055820524" watchObservedRunningTime="2025-12-03 06:46:45.713500325 +0000 UTC m=+943.057083843" Dec 03 06:46:46 crc kubenswrapper[4831]: I1203 06:46:46.694417 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz8wk" event={"ID":"13940a9c-6227-4e54-8683-376584d6f937","Type":"ContainerStarted","Data":"2c7453f1612410c1f7e5d331271b26e05615612b0402a7bfb9671158f4f7e3de"} Dec 03 06:46:47 crc kubenswrapper[4831]: I1203 06:46:47.705743 4831 generic.go:334] "Generic (PLEG): container finished" podID="13940a9c-6227-4e54-8683-376584d6f937" containerID="2c7453f1612410c1f7e5d331271b26e05615612b0402a7bfb9671158f4f7e3de" exitCode=0 Dec 03 06:46:47 crc kubenswrapper[4831]: I1203 06:46:47.705902 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz8wk" 
event={"ID":"13940a9c-6227-4e54-8683-376584d6f937","Type":"ContainerDied","Data":"2c7453f1612410c1f7e5d331271b26e05615612b0402a7bfb9671158f4f7e3de"} Dec 03 06:46:48 crc kubenswrapper[4831]: I1203 06:46:48.715821 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz8wk" event={"ID":"13940a9c-6227-4e54-8683-376584d6f937","Type":"ContainerStarted","Data":"97b35408c651f63379d23754b77f3d22cdf30cf72c0b287186291592ad192295"} Dec 03 06:46:48 crc kubenswrapper[4831]: I1203 06:46:48.745022 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dz8wk" podStartSLOduration=2.2759272360000002 podStartE2EDuration="4.744999401s" podCreationTimestamp="2025-12-03 06:46:44 +0000 UTC" firstStartedPulling="2025-12-03 06:46:45.669857448 +0000 UTC m=+943.013440966" lastFinishedPulling="2025-12-03 06:46:48.138929623 +0000 UTC m=+945.482513131" observedRunningTime="2025-12-03 06:46:48.744745514 +0000 UTC m=+946.088329022" watchObservedRunningTime="2025-12-03 06:46:48.744999401 +0000 UTC m=+946.088582909" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.483848 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rs4hm"] Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.485233 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.488139 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.488220 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.497345 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rs4hm"] Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.500603 4831 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-8jmzg" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.668987 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-726wk\" (UniqueName: \"kubernetes.io/projected/2334e9c7-0d74-4b40-bfa2-42916f53c7aa-kube-api-access-726wk\") pod \"cert-manager-webhook-f4fb5df64-rs4hm\" (UID: \"2334e9c7-0d74-4b40-bfa2-42916f53c7aa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.669137 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2334e9c7-0d74-4b40-bfa2-42916f53c7aa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rs4hm\" (UID: \"2334e9c7-0d74-4b40-bfa2-42916f53c7aa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.770277 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2334e9c7-0d74-4b40-bfa2-42916f53c7aa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rs4hm\" (UID: \"2334e9c7-0d74-4b40-bfa2-42916f53c7aa\") 
" pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.770612 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-726wk\" (UniqueName: \"kubernetes.io/projected/2334e9c7-0d74-4b40-bfa2-42916f53c7aa-kube-api-access-726wk\") pod \"cert-manager-webhook-f4fb5df64-rs4hm\" (UID: \"2334e9c7-0d74-4b40-bfa2-42916f53c7aa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.793432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2334e9c7-0d74-4b40-bfa2-42916f53c7aa-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-rs4hm\" (UID: \"2334e9c7-0d74-4b40-bfa2-42916f53c7aa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.796550 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-726wk\" (UniqueName: \"kubernetes.io/projected/2334e9c7-0d74-4b40-bfa2-42916f53c7aa-kube-api-access-726wk\") pod \"cert-manager-webhook-f4fb5df64-rs4hm\" (UID: \"2334e9c7-0d74-4b40-bfa2-42916f53c7aa\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:46:50 crc kubenswrapper[4831]: I1203 06:46:50.847705 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.298133 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-rs4hm"] Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.575305 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c"] Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.576278 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.584493 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c"] Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.620352 4831 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vgl7r" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.686435 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmklb\" (UniqueName: \"kubernetes.io/projected/4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b-kube-api-access-qmklb\") pod \"cert-manager-cainjector-855d9ccff4-gkk6c\" (UID: \"4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.686515 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-gkk6c\" (UID: \"4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.737110 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" event={"ID":"2334e9c7-0d74-4b40-bfa2-42916f53c7aa","Type":"ContainerStarted","Data":"993598d4038e6b697dc314e51fc3f36d077edc2ee31911db5f76376f3be2cd84"} Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.787684 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmklb\" (UniqueName: \"kubernetes.io/projected/4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b-kube-api-access-qmklb\") pod \"cert-manager-cainjector-855d9ccff4-gkk6c\" (UID: 
\"4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.787779 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-gkk6c\" (UID: \"4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.810152 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-gkk6c\" (UID: \"4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.810269 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmklb\" (UniqueName: \"kubernetes.io/projected/4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b-kube-api-access-qmklb\") pod \"cert-manager-cainjector-855d9ccff4-gkk6c\" (UID: \"4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" Dec 03 06:46:51 crc kubenswrapper[4831]: I1203 06:46:51.939122 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" Dec 03 06:46:52 crc kubenswrapper[4831]: I1203 06:46:52.361295 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c"] Dec 03 06:46:52 crc kubenswrapper[4831]: W1203 06:46:52.364690 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4bbcf00b_ec72_4ee0_a9ff_683a7ffe476b.slice/crio-64287605289c7dd5bbd9a71ad3aa6a3bc374e662ccbc6c5ae84edb42836a4649 WatchSource:0}: Error finding container 64287605289c7dd5bbd9a71ad3aa6a3bc374e662ccbc6c5ae84edb42836a4649: Status 404 returned error can't find the container with id 64287605289c7dd5bbd9a71ad3aa6a3bc374e662ccbc6c5ae84edb42836a4649 Dec 03 06:46:52 crc kubenswrapper[4831]: I1203 06:46:52.743607 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" event={"ID":"4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b","Type":"ContainerStarted","Data":"64287605289c7dd5bbd9a71ad3aa6a3bc374e662ccbc6c5ae84edb42836a4649"} Dec 03 06:46:54 crc kubenswrapper[4831]: I1203 06:46:54.487369 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:54 crc kubenswrapper[4831]: I1203 06:46:54.487408 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:54 crc kubenswrapper[4831]: I1203 06:46:54.547826 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:54 crc kubenswrapper[4831]: I1203 06:46:54.829207 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:46:56 crc kubenswrapper[4831]: I1203 06:46:56.910506 4831 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-dz8wk"] Dec 03 06:46:56 crc kubenswrapper[4831]: I1203 06:46:56.910773 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dz8wk" podUID="13940a9c-6227-4e54-8683-376584d6f937" containerName="registry-server" containerID="cri-o://97b35408c651f63379d23754b77f3d22cdf30cf72c0b287186291592ad192295" gracePeriod=2 Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.597165 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.597515 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.597576 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.598188 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c066cdb31940f01296e6a59517a0bf4cdb6c0c7137c9abb1a013450afc9368b"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.598241 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://9c066cdb31940f01296e6a59517a0bf4cdb6c0c7137c9abb1a013450afc9368b" gracePeriod=600 Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.814976 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="9c066cdb31940f01296e6a59517a0bf4cdb6c0c7137c9abb1a013450afc9368b" exitCode=0 Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.815039 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"9c066cdb31940f01296e6a59517a0bf4cdb6c0c7137c9abb1a013450afc9368b"} Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.815076 4831 scope.go:117] "RemoveContainer" containerID="591c53f4c8c9620b5b60eed4f0d2632e242390fceb4ae25a90151135f08319c6" Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.818126 4831 generic.go:334] "Generic (PLEG): container finished" podID="13940a9c-6227-4e54-8683-376584d6f937" containerID="97b35408c651f63379d23754b77f3d22cdf30cf72c0b287186291592ad192295" exitCode=0 Dec 03 06:46:57 crc kubenswrapper[4831]: I1203 06:46:57.818161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz8wk" event={"ID":"13940a9c-6227-4e54-8683-376584d6f937","Type":"ContainerDied","Data":"97b35408c651f63379d23754b77f3d22cdf30cf72c0b287186291592ad192295"} Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.462116 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.475976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg8zb\" (UniqueName: \"kubernetes.io/projected/13940a9c-6227-4e54-8683-376584d6f937-kube-api-access-xg8zb\") pod \"13940a9c-6227-4e54-8683-376584d6f937\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.476031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-utilities\") pod \"13940a9c-6227-4e54-8683-376584d6f937\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.476196 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-catalog-content\") pod \"13940a9c-6227-4e54-8683-376584d6f937\" (UID: \"13940a9c-6227-4e54-8683-376584d6f937\") " Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.477919 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-utilities" (OuterVolumeSpecName: "utilities") pod "13940a9c-6227-4e54-8683-376584d6f937" (UID: "13940a9c-6227-4e54-8683-376584d6f937"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.486883 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13940a9c-6227-4e54-8683-376584d6f937-kube-api-access-xg8zb" (OuterVolumeSpecName: "kube-api-access-xg8zb") pod "13940a9c-6227-4e54-8683-376584d6f937" (UID: "13940a9c-6227-4e54-8683-376584d6f937"). InnerVolumeSpecName "kube-api-access-xg8zb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.538065 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "13940a9c-6227-4e54-8683-376584d6f937" (UID: "13940a9c-6227-4e54-8683-376584d6f937"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.578946 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.579012 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg8zb\" (UniqueName: \"kubernetes.io/projected/13940a9c-6227-4e54-8683-376584d6f937-kube-api-access-xg8zb\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.579027 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/13940a9c-6227-4e54-8683-376584d6f937-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.847620 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" event={"ID":"4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b","Type":"ContainerStarted","Data":"39970e06ccf343b2093077d4231ce33181f8316ad72b7d4afebd8af43d092fc0"} Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.851592 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dz8wk" event={"ID":"13940a9c-6227-4e54-8683-376584d6f937","Type":"ContainerDied","Data":"027244d763e696bf7fe47d010e088769892a0208e3c9a8dfbf66d5f79914acfa"} Dec 03 06:47:01 crc 
kubenswrapper[4831]: I1203 06:47:01.851650 4831 scope.go:117] "RemoveContainer" containerID="97b35408c651f63379d23754b77f3d22cdf30cf72c0b287186291592ad192295" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.851605 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dz8wk" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.854135 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" event={"ID":"2334e9c7-0d74-4b40-bfa2-42916f53c7aa","Type":"ContainerStarted","Data":"51bb086ededa0921e3fd2189e0b06ae749b5fd037cac84dcf3b0cd5c7fbacc69"} Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.854348 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.857213 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"fe6e405940a8abb32a63a2b267869e6a6149449d90d59de98ade036092eb761f"} Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.873546 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-gkk6c" podStartSLOduration=1.94191004 podStartE2EDuration="10.873525438s" podCreationTimestamp="2025-12-03 06:46:51 +0000 UTC" firstStartedPulling="2025-12-03 06:46:52.366909532 +0000 UTC m=+949.710493070" lastFinishedPulling="2025-12-03 06:47:01.29852497 +0000 UTC m=+958.642108468" observedRunningTime="2025-12-03 06:47:01.865542147 +0000 UTC m=+959.209125675" watchObservedRunningTime="2025-12-03 06:47:01.873525438 +0000 UTC m=+959.217108956" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.889128 4831 scope.go:117] "RemoveContainer" 
containerID="2c7453f1612410c1f7e5d331271b26e05615612b0402a7bfb9671158f4f7e3de" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.919221 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" podStartSLOduration=1.928415782 podStartE2EDuration="11.91920587s" podCreationTimestamp="2025-12-03 06:46:50 +0000 UTC" firstStartedPulling="2025-12-03 06:46:51.304086328 +0000 UTC m=+948.647669836" lastFinishedPulling="2025-12-03 06:47:01.294876406 +0000 UTC m=+958.638459924" observedRunningTime="2025-12-03 06:47:01.914022287 +0000 UTC m=+959.257605795" watchObservedRunningTime="2025-12-03 06:47:01.91920587 +0000 UTC m=+959.262789368" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.921666 4831 scope.go:117] "RemoveContainer" containerID="6d263d83f6a7d21a93c170dbe49068036ccd1158cf8b137fbd2c254b23f40c4d" Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.939230 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dz8wk"] Dec 03 06:47:01 crc kubenswrapper[4831]: I1203 06:47:01.944237 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dz8wk"] Dec 03 06:47:03 crc kubenswrapper[4831]: I1203 06:47:03.025215 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13940a9c-6227-4e54-8683-376584d6f937" path="/var/lib/kubelet/pods/13940a9c-6227-4e54-8683-376584d6f937/volumes" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.119514 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-ztn66"] Dec 03 06:47:07 crc kubenswrapper[4831]: E1203 06:47:07.120343 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13940a9c-6227-4e54-8683-376584d6f937" containerName="extract-content" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.120359 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="13940a9c-6227-4e54-8683-376584d6f937" containerName="extract-content" Dec 03 06:47:07 crc kubenswrapper[4831]: E1203 06:47:07.120375 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13940a9c-6227-4e54-8683-376584d6f937" containerName="registry-server" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.120382 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="13940a9c-6227-4e54-8683-376584d6f937" containerName="registry-server" Dec 03 06:47:07 crc kubenswrapper[4831]: E1203 06:47:07.120411 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13940a9c-6227-4e54-8683-376584d6f937" containerName="extract-utilities" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.120422 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="13940a9c-6227-4e54-8683-376584d6f937" containerName="extract-utilities" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.120593 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="13940a9c-6227-4e54-8683-376584d6f937" containerName="registry-server" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.121150 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-ztn66" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.123803 4831 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-fjfzp" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.133138 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-ztn66"] Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.162833 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/28530532-eea3-4d88-9615-e7525130ea89-bound-sa-token\") pod \"cert-manager-86cb77c54b-ztn66\" (UID: \"28530532-eea3-4d88-9615-e7525130ea89\") " pod="cert-manager/cert-manager-86cb77c54b-ztn66" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.162869 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhl4b\" (UniqueName: \"kubernetes.io/projected/28530532-eea3-4d88-9615-e7525130ea89-kube-api-access-nhl4b\") pod \"cert-manager-86cb77c54b-ztn66\" (UID: \"28530532-eea3-4d88-9615-e7525130ea89\") " pod="cert-manager/cert-manager-86cb77c54b-ztn66" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.264460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/28530532-eea3-4d88-9615-e7525130ea89-bound-sa-token\") pod \"cert-manager-86cb77c54b-ztn66\" (UID: \"28530532-eea3-4d88-9615-e7525130ea89\") " pod="cert-manager/cert-manager-86cb77c54b-ztn66" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.264525 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhl4b\" (UniqueName: \"kubernetes.io/projected/28530532-eea3-4d88-9615-e7525130ea89-kube-api-access-nhl4b\") pod \"cert-manager-86cb77c54b-ztn66\" (UID: 
\"28530532-eea3-4d88-9615-e7525130ea89\") " pod="cert-manager/cert-manager-86cb77c54b-ztn66" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.288982 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhl4b\" (UniqueName: \"kubernetes.io/projected/28530532-eea3-4d88-9615-e7525130ea89-kube-api-access-nhl4b\") pod \"cert-manager-86cb77c54b-ztn66\" (UID: \"28530532-eea3-4d88-9615-e7525130ea89\") " pod="cert-manager/cert-manager-86cb77c54b-ztn66" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.295148 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/28530532-eea3-4d88-9615-e7525130ea89-bound-sa-token\") pod \"cert-manager-86cb77c54b-ztn66\" (UID: \"28530532-eea3-4d88-9615-e7525130ea89\") " pod="cert-manager/cert-manager-86cb77c54b-ztn66" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.450347 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-ztn66" Dec 03 06:47:07 crc kubenswrapper[4831]: I1203 06:47:07.991911 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-ztn66"] Dec 03 06:47:08 crc kubenswrapper[4831]: W1203 06:47:08.001326 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod28530532_eea3_4d88_9615_e7525130ea89.slice/crio-3a52310f171a916588c9318e282f9baff66090615df3698ae6f879c7caa473ac WatchSource:0}: Error finding container 3a52310f171a916588c9318e282f9baff66090615df3698ae6f879c7caa473ac: Status 404 returned error can't find the container with id 3a52310f171a916588c9318e282f9baff66090615df3698ae6f879c7caa473ac Dec 03 06:47:08 crc kubenswrapper[4831]: I1203 06:47:08.918377 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-ztn66" 
event={"ID":"28530532-eea3-4d88-9615-e7525130ea89","Type":"ContainerStarted","Data":"ba70e48f715769077fe6c5b89d1adb5019d2f9236864a41812e3ebec0c41e0fb"} Dec 03 06:47:08 crc kubenswrapper[4831]: I1203 06:47:08.918789 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-ztn66" event={"ID":"28530532-eea3-4d88-9615-e7525130ea89","Type":"ContainerStarted","Data":"3a52310f171a916588c9318e282f9baff66090615df3698ae6f879c7caa473ac"} Dec 03 06:47:08 crc kubenswrapper[4831]: I1203 06:47:08.943913 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-ztn66" podStartSLOduration=1.943889724 podStartE2EDuration="1.943889724s" podCreationTimestamp="2025-12-03 06:47:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:47:08.940045294 +0000 UTC m=+966.283628832" watchObservedRunningTime="2025-12-03 06:47:08.943889724 +0000 UTC m=+966.287473272" Dec 03 06:47:10 crc kubenswrapper[4831]: I1203 06:47:10.854660 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-rs4hm" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.021461 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-wjwhc"] Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.023561 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-wjwhc" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.025417 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.025989 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.027941 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-ch6rg" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.032066 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wjwhc"] Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.070668 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95ntq\" (UniqueName: \"kubernetes.io/projected/4ce91a94-12bd-4653-89b0-eaa20b299d5a-kube-api-access-95ntq\") pod \"openstack-operator-index-wjwhc\" (UID: \"4ce91a94-12bd-4653-89b0-eaa20b299d5a\") " pod="openstack-operators/openstack-operator-index-wjwhc" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.172660 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95ntq\" (UniqueName: \"kubernetes.io/projected/4ce91a94-12bd-4653-89b0-eaa20b299d5a-kube-api-access-95ntq\") pod \"openstack-operator-index-wjwhc\" (UID: \"4ce91a94-12bd-4653-89b0-eaa20b299d5a\") " pod="openstack-operators/openstack-operator-index-wjwhc" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.190522 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95ntq\" (UniqueName: \"kubernetes.io/projected/4ce91a94-12bd-4653-89b0-eaa20b299d5a-kube-api-access-95ntq\") pod \"openstack-operator-index-wjwhc\" (UID: 
\"4ce91a94-12bd-4653-89b0-eaa20b299d5a\") " pod="openstack-operators/openstack-operator-index-wjwhc" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.348458 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wjwhc" Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.821076 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-wjwhc"] Dec 03 06:47:14 crc kubenswrapper[4831]: I1203 06:47:14.959618 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wjwhc" event={"ID":"4ce91a94-12bd-4653-89b0-eaa20b299d5a","Type":"ContainerStarted","Data":"c1036cf4848f9cd0f18e29d4b669f89265650a7dbabc24c9313e7f65c5a22ac7"} Dec 03 06:47:17 crc kubenswrapper[4831]: I1203 06:47:17.983545 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wjwhc" event={"ID":"4ce91a94-12bd-4653-89b0-eaa20b299d5a","Type":"ContainerStarted","Data":"d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97"} Dec 03 06:47:18 crc kubenswrapper[4831]: I1203 06:47:18.004033 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-wjwhc" podStartSLOduration=2.936640553 podStartE2EDuration="5.004004132s" podCreationTimestamp="2025-12-03 06:47:13 +0000 UTC" firstStartedPulling="2025-12-03 06:47:14.832141781 +0000 UTC m=+972.175725289" lastFinishedPulling="2025-12-03 06:47:16.89950536 +0000 UTC m=+974.243088868" observedRunningTime="2025-12-03 06:47:18.000125491 +0000 UTC m=+975.343709009" watchObservedRunningTime="2025-12-03 06:47:18.004004132 +0000 UTC m=+975.347587680" Dec 03 06:47:18 crc kubenswrapper[4831]: I1203 06:47:18.988506 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wjwhc"] Dec 03 06:47:19 crc kubenswrapper[4831]: I1203 06:47:19.619404 
4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-k24lq"] Dec 03 06:47:19 crc kubenswrapper[4831]: I1203 06:47:19.620914 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:19 crc kubenswrapper[4831]: I1203 06:47:19.637030 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k24lq"] Dec 03 06:47:19 crc kubenswrapper[4831]: I1203 06:47:19.762265 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6kbs\" (UniqueName: \"kubernetes.io/projected/f32bda31-93c7-4b6e-af11-0db32110019e-kube-api-access-v6kbs\") pod \"openstack-operator-index-k24lq\" (UID: \"f32bda31-93c7-4b6e-af11-0db32110019e\") " pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:19 crc kubenswrapper[4831]: I1203 06:47:19.863842 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6kbs\" (UniqueName: \"kubernetes.io/projected/f32bda31-93c7-4b6e-af11-0db32110019e-kube-api-access-v6kbs\") pod \"openstack-operator-index-k24lq\" (UID: \"f32bda31-93c7-4b6e-af11-0db32110019e\") " pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:19 crc kubenswrapper[4831]: I1203 06:47:19.886146 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6kbs\" (UniqueName: \"kubernetes.io/projected/f32bda31-93c7-4b6e-af11-0db32110019e-kube-api-access-v6kbs\") pod \"openstack-operator-index-k24lq\" (UID: \"f32bda31-93c7-4b6e-af11-0db32110019e\") " pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:19 crc kubenswrapper[4831]: I1203 06:47:19.954097 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:19 crc kubenswrapper[4831]: I1203 06:47:19.999289 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-wjwhc" podUID="4ce91a94-12bd-4653-89b0-eaa20b299d5a" containerName="registry-server" containerID="cri-o://d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97" gracePeriod=2 Dec 03 06:47:20 crc kubenswrapper[4831]: I1203 06:47:20.398016 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wjwhc" Dec 03 06:47:20 crc kubenswrapper[4831]: I1203 06:47:20.440901 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-k24lq"] Dec 03 06:47:20 crc kubenswrapper[4831]: W1203 06:47:20.457709 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf32bda31_93c7_4b6e_af11_0db32110019e.slice/crio-3d98853bacc2b3e237c5bc05a77b82d42e64c929445d7040bade5c217a92f582 WatchSource:0}: Error finding container 3d98853bacc2b3e237c5bc05a77b82d42e64c929445d7040bade5c217a92f582: Status 404 returned error can't find the container with id 3d98853bacc2b3e237c5bc05a77b82d42e64c929445d7040bade5c217a92f582 Dec 03 06:47:20 crc kubenswrapper[4831]: I1203 06:47:20.573355 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95ntq\" (UniqueName: \"kubernetes.io/projected/4ce91a94-12bd-4653-89b0-eaa20b299d5a-kube-api-access-95ntq\") pod \"4ce91a94-12bd-4653-89b0-eaa20b299d5a\" (UID: \"4ce91a94-12bd-4653-89b0-eaa20b299d5a\") " Dec 03 06:47:20 crc kubenswrapper[4831]: I1203 06:47:20.581760 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ce91a94-12bd-4653-89b0-eaa20b299d5a-kube-api-access-95ntq" (OuterVolumeSpecName: 
"kube-api-access-95ntq") pod "4ce91a94-12bd-4653-89b0-eaa20b299d5a" (UID: "4ce91a94-12bd-4653-89b0-eaa20b299d5a"). InnerVolumeSpecName "kube-api-access-95ntq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:47:20 crc kubenswrapper[4831]: I1203 06:47:20.675220 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95ntq\" (UniqueName: \"kubernetes.io/projected/4ce91a94-12bd-4653-89b0-eaa20b299d5a-kube-api-access-95ntq\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.017604 4831 generic.go:334] "Generic (PLEG): container finished" podID="4ce91a94-12bd-4653-89b0-eaa20b299d5a" containerID="d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97" exitCode=0 Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.017707 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-wjwhc" Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.028962 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k24lq" event={"ID":"f32bda31-93c7-4b6e-af11-0db32110019e","Type":"ContainerStarted","Data":"3bb321c07e9f7e2f8a4f81b1c69084ef2f46e9443c044f002ad37df134ae8347"} Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.029021 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-k24lq" event={"ID":"f32bda31-93c7-4b6e-af11-0db32110019e","Type":"ContainerStarted","Data":"3d98853bacc2b3e237c5bc05a77b82d42e64c929445d7040bade5c217a92f582"} Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.029042 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wjwhc" event={"ID":"4ce91a94-12bd-4653-89b0-eaa20b299d5a","Type":"ContainerDied","Data":"d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97"} Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.029069 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-wjwhc" event={"ID":"4ce91a94-12bd-4653-89b0-eaa20b299d5a","Type":"ContainerDied","Data":"c1036cf4848f9cd0f18e29d4b669f89265650a7dbabc24c9313e7f65c5a22ac7"} Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.029098 4831 scope.go:117] "RemoveContainer" containerID="d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97" Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.046598 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-k24lq" podStartSLOduration=1.99128376 podStartE2EDuration="2.04657516s" podCreationTimestamp="2025-12-03 06:47:19 +0000 UTC" firstStartedPulling="2025-12-03 06:47:20.465067594 +0000 UTC m=+977.808651142" lastFinishedPulling="2025-12-03 06:47:20.520359024 +0000 UTC m=+977.863942542" observedRunningTime="2025-12-03 06:47:21.035677089 +0000 UTC m=+978.379260627" watchObservedRunningTime="2025-12-03 06:47:21.04657516 +0000 UTC m=+978.390158678" Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.066092 4831 scope.go:117] "RemoveContainer" containerID="d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97" Dec 03 06:47:21 crc kubenswrapper[4831]: E1203 06:47:21.070196 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97\": container with ID starting with d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97 not found: ID does not exist" containerID="d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97" Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.070383 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97"} err="failed to get container status 
\"d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97\": rpc error: code = NotFound desc = could not find container \"d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97\": container with ID starting with d94d887b9776baa8d1920fee3c2e195660fc63b5772b77b96311afdff7678f97 not found: ID does not exist" Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.076869 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-wjwhc"] Dec 03 06:47:21 crc kubenswrapper[4831]: I1203 06:47:21.082647 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-wjwhc"] Dec 03 06:47:23 crc kubenswrapper[4831]: I1203 06:47:23.025274 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ce91a94-12bd-4653-89b0-eaa20b299d5a" path="/var/lib/kubelet/pods/4ce91a94-12bd-4653-89b0-eaa20b299d5a/volumes" Dec 03 06:47:29 crc kubenswrapper[4831]: I1203 06:47:29.954708 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:29 crc kubenswrapper[4831]: I1203 06:47:29.955351 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:29 crc kubenswrapper[4831]: I1203 06:47:29.982163 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:30 crc kubenswrapper[4831]: I1203 06:47:30.107973 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-k24lq" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.577531 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n"] Dec 03 06:47:38 crc kubenswrapper[4831]: E1203 06:47:38.578564 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4ce91a94-12bd-4653-89b0-eaa20b299d5a" containerName="registry-server" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.578585 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ce91a94-12bd-4653-89b0-eaa20b299d5a" containerName="registry-server" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.578760 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ce91a94-12bd-4653-89b0-eaa20b299d5a" containerName="registry-server" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.580080 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.582562 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-brp95" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.596504 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n"] Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.724689 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls962\" (UniqueName: \"kubernetes.io/projected/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-kube-api-access-ls962\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.725835 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-util\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: 
\"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.726130 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-bundle\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.827116 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-bundle\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.827219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls962\" (UniqueName: \"kubernetes.io/projected/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-kube-api-access-ls962\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.827412 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-util\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 
06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.827949 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-bundle\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.828151 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-util\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.862252 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls962\" (UniqueName: \"kubernetes.io/projected/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-kube-api-access-ls962\") pod \"de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:38 crc kubenswrapper[4831]: I1203 06:47:38.914767 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:39 crc kubenswrapper[4831]: I1203 06:47:39.422868 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n"] Dec 03 06:47:39 crc kubenswrapper[4831]: W1203 06:47:39.430371 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcfbcafe_4cef_4e71_a46f_e655a31beb6b.slice/crio-0d1d7b429d10909733e00df6ebe9ecc6c9f35778c1d26ebda3f05b641292343c WatchSource:0}: Error finding container 0d1d7b429d10909733e00df6ebe9ecc6c9f35778c1d26ebda3f05b641292343c: Status 404 returned error can't find the container with id 0d1d7b429d10909733e00df6ebe9ecc6c9f35778c1d26ebda3f05b641292343c Dec 03 06:47:40 crc kubenswrapper[4831]: I1203 06:47:40.161089 4831 generic.go:334] "Generic (PLEG): container finished" podID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerID="a347672d0ea19b3f76582da6eafe1625748a5279d0bdd0fd7608ec9ad5b3e013" exitCode=0 Dec 03 06:47:40 crc kubenswrapper[4831]: I1203 06:47:40.161172 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" event={"ID":"dcfbcafe-4cef-4e71-a46f-e655a31beb6b","Type":"ContainerDied","Data":"a347672d0ea19b3f76582da6eafe1625748a5279d0bdd0fd7608ec9ad5b3e013"} Dec 03 06:47:40 crc kubenswrapper[4831]: I1203 06:47:40.164969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" event={"ID":"dcfbcafe-4cef-4e71-a46f-e655a31beb6b","Type":"ContainerStarted","Data":"0d1d7b429d10909733e00df6ebe9ecc6c9f35778c1d26ebda3f05b641292343c"} Dec 03 06:47:41 crc kubenswrapper[4831]: I1203 06:47:41.175530 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerID="4c6032733400abd10e628b2c87e441cbfdab15cd2c3da0873d884bd1e7ba0d6f" exitCode=0 Dec 03 06:47:41 crc kubenswrapper[4831]: I1203 06:47:41.175631 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" event={"ID":"dcfbcafe-4cef-4e71-a46f-e655a31beb6b","Type":"ContainerDied","Data":"4c6032733400abd10e628b2c87e441cbfdab15cd2c3da0873d884bd1e7ba0d6f"} Dec 03 06:47:42 crc kubenswrapper[4831]: I1203 06:47:42.187013 4831 generic.go:334] "Generic (PLEG): container finished" podID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerID="e54cbbf908261659bbb5a0df0821b1e3d031fc75a401a8ea8ea3aed8a2f30601" exitCode=0 Dec 03 06:47:42 crc kubenswrapper[4831]: I1203 06:47:42.187113 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" event={"ID":"dcfbcafe-4cef-4e71-a46f-e655a31beb6b","Type":"ContainerDied","Data":"e54cbbf908261659bbb5a0df0821b1e3d031fc75a401a8ea8ea3aed8a2f30601"} Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.437377 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.592959 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-util\") pod \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.592992 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-bundle\") pod \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.593035 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ls962\" (UniqueName: \"kubernetes.io/projected/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-kube-api-access-ls962\") pod \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\" (UID: \"dcfbcafe-4cef-4e71-a46f-e655a31beb6b\") " Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.594003 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-bundle" (OuterVolumeSpecName: "bundle") pod "dcfbcafe-4cef-4e71-a46f-e655a31beb6b" (UID: "dcfbcafe-4cef-4e71-a46f-e655a31beb6b"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.602741 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-kube-api-access-ls962" (OuterVolumeSpecName: "kube-api-access-ls962") pod "dcfbcafe-4cef-4e71-a46f-e655a31beb6b" (UID: "dcfbcafe-4cef-4e71-a46f-e655a31beb6b"). InnerVolumeSpecName "kube-api-access-ls962". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.616535 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-util" (OuterVolumeSpecName: "util") pod "dcfbcafe-4cef-4e71-a46f-e655a31beb6b" (UID: "dcfbcafe-4cef-4e71-a46f-e655a31beb6b"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.694125 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.694159 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-util\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:43 crc kubenswrapper[4831]: I1203 06:47:43.694170 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ls962\" (UniqueName: \"kubernetes.io/projected/dcfbcafe-4cef-4e71-a46f-e655a31beb6b-kube-api-access-ls962\") on node \"crc\" DevicePath \"\"" Dec 03 06:47:44 crc kubenswrapper[4831]: I1203 06:47:44.205652 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" event={"ID":"dcfbcafe-4cef-4e71-a46f-e655a31beb6b","Type":"ContainerDied","Data":"0d1d7b429d10909733e00df6ebe9ecc6c9f35778c1d26ebda3f05b641292343c"} Dec 03 06:47:44 crc kubenswrapper[4831]: I1203 06:47:44.205703 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0d1d7b429d10909733e00df6ebe9ecc6c9f35778c1d26ebda3f05b641292343c" Dec 03 06:47:44 crc kubenswrapper[4831]: I1203 06:47:44.205752 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n" Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.944723 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g"] Dec 03 06:47:50 crc kubenswrapper[4831]: E1203 06:47:50.945431 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerName="util" Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.945447 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerName="util" Dec 03 06:47:50 crc kubenswrapper[4831]: E1203 06:47:50.945461 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerName="pull" Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.945469 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerName="pull" Dec 03 06:47:50 crc kubenswrapper[4831]: E1203 06:47:50.945481 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerName="extract" Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.945488 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerName="extract" Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.945623 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dcfbcafe-4cef-4e71-a46f-e655a31beb6b" containerName="extract" Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.946012 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.948035 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-operator-dockercfg-g9z46" Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.978097 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g"] Dec 03 06:47:50 crc kubenswrapper[4831]: I1203 06:47:50.996979 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qlj4\" (UniqueName: \"kubernetes.io/projected/3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1-kube-api-access-6qlj4\") pod \"openstack-operator-controller-operator-5b4678cf94-jj86g\" (UID: \"3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1\") " pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" Dec 03 06:47:51 crc kubenswrapper[4831]: I1203 06:47:51.098793 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qlj4\" (UniqueName: \"kubernetes.io/projected/3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1-kube-api-access-6qlj4\") pod \"openstack-operator-controller-operator-5b4678cf94-jj86g\" (UID: \"3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1\") " pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" Dec 03 06:47:51 crc kubenswrapper[4831]: I1203 06:47:51.120480 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qlj4\" (UniqueName: \"kubernetes.io/projected/3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1-kube-api-access-6qlj4\") pod \"openstack-operator-controller-operator-5b4678cf94-jj86g\" (UID: \"3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1\") " pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" Dec 03 06:47:51 crc kubenswrapper[4831]: I1203 06:47:51.262041 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" Dec 03 06:47:51 crc kubenswrapper[4831]: I1203 06:47:51.709592 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g"] Dec 03 06:47:52 crc kubenswrapper[4831]: I1203 06:47:52.264769 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" event={"ID":"3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1","Type":"ContainerStarted","Data":"ca1ffdb02c953e9e7b941369fc1bbbbc9e964e291b2662f9471deb26ebf20399"} Dec 03 06:47:57 crc kubenswrapper[4831]: I1203 06:47:57.299125 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" event={"ID":"3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1","Type":"ContainerStarted","Data":"c59fafdd58eb83e19882c41ea278410e832eef31de6115707033c9cf0cda5104"} Dec 03 06:47:57 crc kubenswrapper[4831]: I1203 06:47:57.299698 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" Dec 03 06:47:57 crc kubenswrapper[4831]: I1203 06:47:57.352236 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" podStartSLOduration=2.454523337 podStartE2EDuration="7.352210507s" podCreationTimestamp="2025-12-03 06:47:50 +0000 UTC" firstStartedPulling="2025-12-03 06:47:51.718599029 +0000 UTC m=+1009.062182547" lastFinishedPulling="2025-12-03 06:47:56.616286199 +0000 UTC m=+1013.959869717" observedRunningTime="2025-12-03 06:47:57.344724043 +0000 UTC m=+1014.688307561" watchObservedRunningTime="2025-12-03 06:47:57.352210507 +0000 UTC m=+1014.695794055" Dec 03 06:48:01 crc kubenswrapper[4831]: I1203 06:48:01.266289 
4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-operator-5b4678cf94-jj86g" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.232753 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.234123 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.241921 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.242590 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-7cb9b" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.269889 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.270852 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.273690 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-8wknx" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.274743 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wczqs\" (UniqueName: \"kubernetes.io/projected/379fb9f5-e9c6-4362-b40a-c80ac7f58562-kube-api-access-wczqs\") pod \"cinder-operator-controller-manager-859b6ccc6-mcp6w\" (UID: \"379fb9f5-e9c6-4362-b40a-c80ac7f58562\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.274800 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rjrk\" (UniqueName: \"kubernetes.io/projected/dd91cf33-b91c-4430-ae22-ff8f52171f08-kube-api-access-8rjrk\") pod \"barbican-operator-controller-manager-7d9dfd778-9tvxr\" (UID: \"dd91cf33-b91c-4430-ae22-ff8f52171f08\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.278572 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.279768 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.282684 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-xzd9t" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.292542 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.298293 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.305983 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-cd784" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.324565 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.336998 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.340179 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.342981 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-ctz8p" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.355186 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.356144 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.364619 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-fzk9s" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.371387 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.375962 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbsn\" (UniqueName: \"kubernetes.io/projected/13a6a910-c42b-4ba4-85ad-62d932c41b4d-kube-api-access-4lbsn\") pod \"glance-operator-controller-manager-77987cd8cd-xgw7z\" (UID: \"13a6a910-c42b-4ba4-85ad-62d932c41b4d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.376007 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ls9k\" (UniqueName: \"kubernetes.io/projected/facee23f-2039-4bc2-84e2-c209c96f0812-kube-api-access-2ls9k\") pod \"heat-operator-controller-manager-5f64f6f8bb-vqvxq\" (UID: \"facee23f-2039-4bc2-84e2-c209c96f0812\") " 
pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.376036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd6vv\" (UniqueName: \"kubernetes.io/projected/f4448da7-6edc-46ba-8a6c-d5491ddfc9a2-kube-api-access-sd6vv\") pod \"horizon-operator-controller-manager-68c6d99b8f-cvxcd\" (UID: \"f4448da7-6edc-46ba-8a6c-d5491ddfc9a2\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.376070 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wczqs\" (UniqueName: \"kubernetes.io/projected/379fb9f5-e9c6-4362-b40a-c80ac7f58562-kube-api-access-wczqs\") pod \"cinder-operator-controller-manager-859b6ccc6-mcp6w\" (UID: \"379fb9f5-e9c6-4362-b40a-c80ac7f58562\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.376094 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rjrk\" (UniqueName: \"kubernetes.io/projected/dd91cf33-b91c-4430-ae22-ff8f52171f08-kube-api-access-8rjrk\") pod \"barbican-operator-controller-manager-7d9dfd778-9tvxr\" (UID: \"dd91cf33-b91c-4430-ae22-ff8f52171f08\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.376130 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gncv8\" (UniqueName: \"kubernetes.io/projected/61e7e997-91d1-4a49-8243-a0032d9ce077-kube-api-access-gncv8\") pod \"designate-operator-controller-manager-78b4bc895b-5lshd\" (UID: \"61e7e997-91d1-4a49-8243-a0032d9ce077\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" Dec 03 06:48:35 crc kubenswrapper[4831]: 
I1203 06:48:35.386383 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.391143 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.400341 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rjrk\" (UniqueName: \"kubernetes.io/projected/dd91cf33-b91c-4430-ae22-ff8f52171f08-kube-api-access-8rjrk\") pod \"barbican-operator-controller-manager-7d9dfd778-9tvxr\" (UID: \"dd91cf33-b91c-4430-ae22-ff8f52171f08\") " pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.401298 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wczqs\" (UniqueName: \"kubernetes.io/projected/379fb9f5-e9c6-4362-b40a-c80ac7f58562-kube-api-access-wczqs\") pod \"cinder-operator-controller-manager-859b6ccc6-mcp6w\" (UID: \"379fb9f5-e9c6-4362-b40a-c80ac7f58562\") " pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.405662 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.410516 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-bmk89"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.411401 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.415208 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.417517 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-wbqsk" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.422358 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.423350 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.434922 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kzlx9" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.441112 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-bmk89"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.450801 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.458013 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.459006 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.460278 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-nl245" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.468740 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.475870 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.476793 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gncv8\" (UniqueName: \"kubernetes.io/projected/61e7e997-91d1-4a49-8243-a0032d9ce077-kube-api-access-gncv8\") pod \"designate-operator-controller-manager-78b4bc895b-5lshd\" (UID: \"61e7e997-91d1-4a49-8243-a0032d9ce077\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.476848 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzrw9\" (UniqueName: \"kubernetes.io/projected/0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b-kube-api-access-rzrw9\") pod \"ironic-operator-controller-manager-6c548fd776-l25t8\" (UID: \"0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.476879 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp2qp\" (UniqueName: \"kubernetes.io/projected/6589826b-47ab-4f38-bfc6-e6d79787e272-kube-api-access-jp2qp\") pod \"keystone-operator-controller-manager-7765d96ddf-xl7b2\" 
(UID: \"6589826b-47ab-4f38-bfc6-e6d79787e272\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.476899 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbsn\" (UniqueName: \"kubernetes.io/projected/13a6a910-c42b-4ba4-85ad-62d932c41b4d-kube-api-access-4lbsn\") pod \"glance-operator-controller-manager-77987cd8cd-xgw7z\" (UID: \"13a6a910-c42b-4ba4-85ad-62d932c41b4d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.476916 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.476928 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ls9k\" (UniqueName: \"kubernetes.io/projected/facee23f-2039-4bc2-84e2-c209c96f0812-kube-api-access-2ls9k\") pod \"heat-operator-controller-manager-5f64f6f8bb-vqvxq\" (UID: \"facee23f-2039-4bc2-84e2-c209c96f0812\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.476944 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5jvn\" (UniqueName: \"kubernetes.io/projected/34580c97-5b51-43ab-affa-68c03a7c1d4d-kube-api-access-z5jvn\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.476972 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd6vv\" (UniqueName: 
\"kubernetes.io/projected/f4448da7-6edc-46ba-8a6c-d5491ddfc9a2-kube-api-access-sd6vv\") pod \"horizon-operator-controller-manager-68c6d99b8f-cvxcd\" (UID: \"f4448da7-6edc-46ba-8a6c-d5491ddfc9a2\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.477001 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.483270 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-6ds7k" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.489293 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.512958 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.514053 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.516367 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-2l2fq" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.520371 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.523349 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ls9k\" (UniqueName: \"kubernetes.io/projected/facee23f-2039-4bc2-84e2-c209c96f0812-kube-api-access-2ls9k\") pod \"heat-operator-controller-manager-5f64f6f8bb-vqvxq\" (UID: \"facee23f-2039-4bc2-84e2-c209c96f0812\") " pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.524612 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gncv8\" (UniqueName: \"kubernetes.io/projected/61e7e997-91d1-4a49-8243-a0032d9ce077-kube-api-access-gncv8\") pod \"designate-operator-controller-manager-78b4bc895b-5lshd\" (UID: \"61e7e997-91d1-4a49-8243-a0032d9ce077\") " pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.527028 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbsn\" (UniqueName: \"kubernetes.io/projected/13a6a910-c42b-4ba4-85ad-62d932c41b4d-kube-api-access-4lbsn\") pod \"glance-operator-controller-manager-77987cd8cd-xgw7z\" (UID: \"13a6a910-c42b-4ba4-85ad-62d932c41b4d\") " pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.534865 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-sd6vv\" (UniqueName: \"kubernetes.io/projected/f4448da7-6edc-46ba-8a6c-d5491ddfc9a2-kube-api-access-sd6vv\") pod \"horizon-operator-controller-manager-68c6d99b8f-cvxcd\" (UID: \"f4448da7-6edc-46ba-8a6c-d5491ddfc9a2\") " pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.548940 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.549871 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.552709 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-qpvld" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.554587 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.574372 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.579712 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mzkp\" (UniqueName: \"kubernetes.io/projected/f720b38f-39f1-4b9e-a6ee-268c76a855a0-kube-api-access-6mzkp\") pod \"mariadb-operator-controller-manager-56bbcc9d85-dg285\" (UID: \"f720b38f-39f1-4b9e-a6ee-268c76a855a0\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.579764 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzrw9\" (UniqueName: \"kubernetes.io/projected/0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b-kube-api-access-rzrw9\") pod \"ironic-operator-controller-manager-6c548fd776-l25t8\" (UID: \"0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.579801 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp2qp\" (UniqueName: \"kubernetes.io/projected/6589826b-47ab-4f38-bfc6-e6d79787e272-kube-api-access-jp2qp\") pod \"keystone-operator-controller-manager-7765d96ddf-xl7b2\" (UID: \"6589826b-47ab-4f38-bfc6-e6d79787e272\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.579831 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5jvn\" (UniqueName: \"kubernetes.io/projected/34580c97-5b51-43ab-affa-68c03a7c1d4d-kube-api-access-z5jvn\") pod 
\"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.579862 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlpct\" (UniqueName: \"kubernetes.io/projected/a7a1c9f6-03de-405f-b50a-31494910f498-kube-api-access-rlpct\") pod \"manila-operator-controller-manager-7c79b5df47-k9882\" (UID: \"a7a1c9f6-03de-405f-b50a-31494910f498\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.579887 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzf2s\" (UniqueName: \"kubernetes.io/projected/273bb4e9-067c-47e7-8ef0-973e2890ecb0-kube-api-access-dzf2s\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-6wzs6\" (UID: \"273bb4e9-067c-47e7-8ef0-973e2890ecb0\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.579910 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.580258 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh"] Dec 03 06:48:35 crc kubenswrapper[4831]: E1203 06:48:35.581055 4831 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:35 crc 
kubenswrapper[4831]: E1203 06:48:35.581239 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert podName:34580c97-5b51-43ab-affa-68c03a7c1d4d nodeName:}" failed. No retries permitted until 2025-12-03 06:48:36.081212301 +0000 UTC m=+1053.424795809 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert") pod "infra-operator-controller-manager-57548d458d-bmk89" (UID: "34580c97-5b51-43ab-affa-68c03a7c1d4d") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.581425 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.585823 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-8fn7b" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.593592 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.602647 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.609721 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5jvn\" (UniqueName: \"kubernetes.io/projected/34580c97-5b51-43ab-affa-68c03a7c1d4d-kube-api-access-z5jvn\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.610014 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.616999 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.618444 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.619729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp2qp\" (UniqueName: \"kubernetes.io/projected/6589826b-47ab-4f38-bfc6-e6d79787e272-kube-api-access-jp2qp\") pod \"keystone-operator-controller-manager-7765d96ddf-xl7b2\" (UID: \"6589826b-47ab-4f38-bfc6-e6d79787e272\") " pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.625346 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-sqv9t" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.637120 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.637522 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzrw9\" (UniqueName: \"kubernetes.io/projected/0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b-kube-api-access-rzrw9\") pod \"ironic-operator-controller-manager-6c548fd776-l25t8\" (UID: \"0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b\") " pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.640757 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.665655 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.676011 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.676424 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.677459 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.681438 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-v6zsz" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.681438 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.681686 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlpct\" (UniqueName: \"kubernetes.io/projected/a7a1c9f6-03de-405f-b50a-31494910f498-kube-api-access-rlpct\") pod \"manila-operator-controller-manager-7c79b5df47-k9882\" (UID: \"a7a1c9f6-03de-405f-b50a-31494910f498\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.681745 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzf2s\" (UniqueName: \"kubernetes.io/projected/273bb4e9-067c-47e7-8ef0-973e2890ecb0-kube-api-access-dzf2s\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-6wzs6\" (UID: \"273bb4e9-067c-47e7-8ef0-973e2890ecb0\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.683407 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6mzkp\" (UniqueName: \"kubernetes.io/projected/f720b38f-39f1-4b9e-a6ee-268c76a855a0-kube-api-access-6mzkp\") pod \"mariadb-operator-controller-manager-56bbcc9d85-dg285\" (UID: \"f720b38f-39f1-4b9e-a6ee-268c76a855a0\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.721683 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlpct\" (UniqueName: \"kubernetes.io/projected/a7a1c9f6-03de-405f-b50a-31494910f498-kube-api-access-rlpct\") pod \"manila-operator-controller-manager-7c79b5df47-k9882\" (UID: \"a7a1c9f6-03de-405f-b50a-31494910f498\") " pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.729190 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzf2s\" (UniqueName: \"kubernetes.io/projected/273bb4e9-067c-47e7-8ef0-973e2890ecb0-kube-api-access-dzf2s\") pod \"neutron-operator-controller-manager-5fdfd5b6b5-6wzs6\" (UID: \"273bb4e9-067c-47e7-8ef0-973e2890ecb0\") " pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.737923 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.750054 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mzkp\" (UniqueName: \"kubernetes.io/projected/f720b38f-39f1-4b9e-a6ee-268c76a855a0-kube-api-access-6mzkp\") pod \"mariadb-operator-controller-manager-56bbcc9d85-dg285\" (UID: \"f720b38f-39f1-4b9e-a6ee-268c76a855a0\") " pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.766868 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.784417 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.786743 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8j7s\" (UniqueName: \"kubernetes.io/projected/17a7b8c5-b7a4-430e-b910-20d0c9a97dc1-kube-api-access-n8j7s\") pod \"nova-operator-controller-manager-697bc559fc-4fxmh\" (UID: \"17a7b8c5-b7a4-430e-b910-20d0c9a97dc1\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.786812 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp44f\" (UniqueName: \"kubernetes.io/projected/1a45949e-adca-4398-82a3-a0d25c8f9702-kube-api-access-jp44f\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.786862 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2-kube-api-access-jrcgx\") pod \"octavia-operator-controller-manager-998648c74-cdbrr\" (UID: \"aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.786922 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.788459 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.789744 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.800599 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.811068 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.813518 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.829984 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-446jb" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.830419 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-7wqsx" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.843886 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.894713 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8j7s\" (UniqueName: \"kubernetes.io/projected/17a7b8c5-b7a4-430e-b910-20d0c9a97dc1-kube-api-access-n8j7s\") pod \"nova-operator-controller-manager-697bc559fc-4fxmh\" (UID: \"17a7b8c5-b7a4-430e-b910-20d0c9a97dc1\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.894766 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp44f\" (UniqueName: \"kubernetes.io/projected/1a45949e-adca-4398-82a3-a0d25c8f9702-kube-api-access-jp44f\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.894811 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2-kube-api-access-jrcgx\") pod \"octavia-operator-controller-manager-998648c74-cdbrr\" (UID: 
\"aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.894838 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:35 crc kubenswrapper[4831]: E1203 06:48:35.894966 4831 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:35 crc kubenswrapper[4831]: E1203 06:48:35.895009 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert podName:1a45949e-adca-4398-82a3-a0d25c8f9702 nodeName:}" failed. No retries permitted until 2025-12-03 06:48:36.39499451 +0000 UTC m=+1053.738578008 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" (UID: "1a45949e-adca-4398-82a3-a0d25c8f9702") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.918133 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.934747 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.935359 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp44f\" (UniqueName: \"kubernetes.io/projected/1a45949e-adca-4398-82a3-a0d25c8f9702-kube-api-access-jp44f\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.936729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8j7s\" (UniqueName: \"kubernetes.io/projected/17a7b8c5-b7a4-430e-b910-20d0c9a97dc1-kube-api-access-n8j7s\") pod \"nova-operator-controller-manager-697bc559fc-4fxmh\" (UID: \"17a7b8c5-b7a4-430e-b910-20d0c9a97dc1\") " pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.937431 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.938463 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.939181 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.941625 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcgx\" (UniqueName: \"kubernetes.io/projected/aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2-kube-api-access-jrcgx\") pod \"octavia-operator-controller-manager-998648c74-cdbrr\" (UID: \"aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2\") " pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.944753 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-t9pf4" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.965077 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.984438 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.985156 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp"] Dec 03 06:48:35 crc kubenswrapper[4831]: I1203 06:48:35.986174 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:35.995847 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.009680 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-h9nms" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.013722 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qctc\" (UniqueName: \"kubernetes.io/projected/3c0f7e03-1610-4c34-824f-c6b7ad6310ea-kube-api-access-8qctc\") pod \"placement-operator-controller-manager-78f8948974-vg5hb\" (UID: \"3c0f7e03-1610-4c34-824f-c6b7ad6310ea\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.013831 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jn542\" (UniqueName: \"kubernetes.io/projected/4628f220-2e57-479f-b91a-9dea443d3456-kube-api-access-jn542\") pod \"ovn-operator-controller-manager-b6456fdb6-kx5qm\" (UID: \"4628f220-2e57-479f-b91a-9dea443d3456\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.040466 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.117292 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jn542\" (UniqueName: \"kubernetes.io/projected/4628f220-2e57-479f-b91a-9dea443d3456-kube-api-access-jn542\") pod \"ovn-operator-controller-manager-b6456fdb6-kx5qm\" (UID: 
\"4628f220-2e57-479f-b91a-9dea443d3456\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.117357 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.117395 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swc29\" (UniqueName: \"kubernetes.io/projected/45d17cba-18de-45cc-8561-dd1d50b9061a-kube-api-access-swc29\") pod \"swift-operator-controller-manager-5f8c65bbfc-v9ptz\" (UID: \"45d17cba-18de-45cc-8561-dd1d50b9061a\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.117442 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5rrt\" (UniqueName: \"kubernetes.io/projected/44a494a8-2cda-4092-9510-41314e5f93c8-kube-api-access-l5rrt\") pod \"telemetry-operator-controller-manager-76cc84c6bb-txlnp\" (UID: \"44a494a8-2cda-4092-9510-41314e5f93c8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.117461 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qctc\" (UniqueName: \"kubernetes.io/projected/3c0f7e03-1610-4c34-824f-c6b7ad6310ea-kube-api-access-8qctc\") pod \"placement-operator-controller-manager-78f8948974-vg5hb\" (UID: \"3c0f7e03-1610-4c34-824f-c6b7ad6310ea\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" Dec 03 06:48:36 crc 
kubenswrapper[4831]: E1203 06:48:36.117932 4831 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:36 crc kubenswrapper[4831]: E1203 06:48:36.117970 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert podName:34580c97-5b51-43ab-affa-68c03a7c1d4d nodeName:}" failed. No retries permitted until 2025-12-03 06:48:37.117955906 +0000 UTC m=+1054.461539414 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert") pod "infra-operator-controller-manager-57548d458d-bmk89" (UID: "34580c97-5b51-43ab-affa-68c03a7c1d4d") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.139611 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.140614 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.157758 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-mb6ns" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.168604 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8qctc\" (UniqueName: \"kubernetes.io/projected/3c0f7e03-1610-4c34-824f-c6b7ad6310ea-kube-api-access-8qctc\") pod \"placement-operator-controller-manager-78f8948974-vg5hb\" (UID: \"3c0f7e03-1610-4c34-824f-c6b7ad6310ea\") " pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.187808 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.202773 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.218170 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swc29\" (UniqueName: \"kubernetes.io/projected/45d17cba-18de-45cc-8561-dd1d50b9061a-kube-api-access-swc29\") pod \"swift-operator-controller-manager-5f8c65bbfc-v9ptz\" (UID: \"45d17cba-18de-45cc-8561-dd1d50b9061a\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.218555 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5rrt\" (UniqueName: \"kubernetes.io/projected/44a494a8-2cda-4092-9510-41314e5f93c8-kube-api-access-l5rrt\") pod \"telemetry-operator-controller-manager-76cc84c6bb-txlnp\" (UID: \"44a494a8-2cda-4092-9510-41314e5f93c8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.255071 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jn542\" (UniqueName: \"kubernetes.io/projected/4628f220-2e57-479f-b91a-9dea443d3456-kube-api-access-jn542\") pod \"ovn-operator-controller-manager-b6456fdb6-kx5qm\" (UID: \"4628f220-2e57-479f-b91a-9dea443d3456\") " pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.267090 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.268482 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.270766 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5rrt\" (UniqueName: \"kubernetes.io/projected/44a494a8-2cda-4092-9510-41314e5f93c8-kube-api-access-l5rrt\") pod \"telemetry-operator-controller-manager-76cc84c6bb-txlnp\" (UID: \"44a494a8-2cda-4092-9510-41314e5f93c8\") " pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.271298 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swc29\" (UniqueName: \"kubernetes.io/projected/45d17cba-18de-45cc-8561-dd1d50b9061a-kube-api-access-swc29\") pod \"swift-operator-controller-manager-5f8c65bbfc-v9ptz\" (UID: \"45d17cba-18de-45cc-8561-dd1d50b9061a\") " pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.271820 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-lzbrs" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.291528 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.300688 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.321371 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwfvw\" (UniqueName: \"kubernetes.io/projected/8ce51182-8548-461c-a6b9-1dae9a549221-kube-api-access-qwfvw\") pod \"test-operator-controller-manager-5854674fcc-4sh4s\" (UID: \"8ce51182-8548-461c-a6b9-1dae9a549221\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.321454 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxqbg\" (UniqueName: \"kubernetes.io/projected/c1bc052b-bcf5-43a3-a84b-7ea23a95f18a-kube-api-access-zxqbg\") pod \"watcher-operator-controller-manager-769dc69bc-2tgr7\" (UID: \"c1bc052b-bcf5-43a3-a84b-7ea23a95f18a\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.330224 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.420995 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.422176 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.422484 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxqbg\" (UniqueName: \"kubernetes.io/projected/c1bc052b-bcf5-43a3-a84b-7ea23a95f18a-kube-api-access-zxqbg\") pod \"watcher-operator-controller-manager-769dc69bc-2tgr7\" (UID: \"c1bc052b-bcf5-43a3-a84b-7ea23a95f18a\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.422565 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.422605 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwfvw\" (UniqueName: \"kubernetes.io/projected/8ce51182-8548-461c-a6b9-1dae9a549221-kube-api-access-qwfvw\") pod \"test-operator-controller-manager-5854674fcc-4sh4s\" (UID: \"8ce51182-8548-461c-a6b9-1dae9a549221\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" Dec 03 06:48:36 crc kubenswrapper[4831]: E1203 06:48:36.422916 4831 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:36 crc kubenswrapper[4831]: E1203 06:48:36.422955 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert podName:1a45949e-adca-4398-82a3-a0d25c8f9702 nodeName:}" failed. 
No retries permitted until 2025-12-03 06:48:37.42294088 +0000 UTC m=+1054.766524388 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" (UID: "1a45949e-adca-4398-82a3-a0d25c8f9702") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.435627 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.435788 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.435898 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5wpqz" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.442583 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.444236 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxqbg\" (UniqueName: \"kubernetes.io/projected/c1bc052b-bcf5-43a3-a84b-7ea23a95f18a-kube-api-access-zxqbg\") pod \"watcher-operator-controller-manager-769dc69bc-2tgr7\" (UID: \"c1bc052b-bcf5-43a3-a84b-7ea23a95f18a\") " pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.444391 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwfvw\" (UniqueName: \"kubernetes.io/projected/8ce51182-8548-461c-a6b9-1dae9a549221-kube-api-access-qwfvw\") pod \"test-operator-controller-manager-5854674fcc-4sh4s\" (UID: 
\"8ce51182-8548-461c-a6b9-1dae9a549221\") " pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.482148 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.492601 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.492886 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.493383 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.497538 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-rfj67" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.518108 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.525434 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.525805 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.619474 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" event={"ID":"379fb9f5-e9c6-4362-b40a-c80ac7f58562","Type":"ContainerStarted","Data":"f846d8b55a5a278fd3656039397098dff00535596d82b53bd04a5ebe7c49b106"} Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.626462 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" event={"ID":"13a6a910-c42b-4ba4-85ad-62d932c41b4d","Type":"ContainerStarted","Data":"f8079cecdbe107776f657aafc391cec7367b8356728ce6ce96984709b6c51861"} Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.629173 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.629219 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2sjj\" (UniqueName: \"kubernetes.io/projected/0c0294b0-7070-44b3-adc3-63f6cae3992c-kube-api-access-f2sjj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wmfmc\" (UID: \"0c0294b0-7070-44b3-adc3-63f6cae3992c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.629259 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjvn9\" (UniqueName: 
\"kubernetes.io/projected/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-kube-api-access-mjvn9\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.629300 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.644903 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.669921 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.674351 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd"] Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.730221 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.730264 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2sjj\" (UniqueName: 
\"kubernetes.io/projected/0c0294b0-7070-44b3-adc3-63f6cae3992c-kube-api-access-f2sjj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wmfmc\" (UID: \"0c0294b0-7070-44b3-adc3-63f6cae3992c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.730285 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjvn9\" (UniqueName: \"kubernetes.io/projected/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-kube-api-access-mjvn9\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.730337 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:36 crc kubenswrapper[4831]: E1203 06:48:36.730446 4831 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:48:36 crc kubenswrapper[4831]: E1203 06:48:36.730488 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:37.230475013 +0000 UTC m=+1054.574058521 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "metrics-server-cert" not found Dec 03 06:48:36 crc kubenswrapper[4831]: E1203 06:48:36.730677 4831 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:48:36 crc kubenswrapper[4831]: E1203 06:48:36.730701 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:37.23069283 +0000 UTC m=+1054.574276338 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "webhook-server-cert" not found Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.750879 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjvn9\" (UniqueName: \"kubernetes.io/projected/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-kube-api-access-mjvn9\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.755858 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2sjj\" (UniqueName: \"kubernetes.io/projected/0c0294b0-7070-44b3-adc3-63f6cae3992c-kube-api-access-f2sjj\") pod \"rabbitmq-cluster-operator-manager-668c99d594-wmfmc\" (UID: \"0c0294b0-7070-44b3-adc3-63f6cae3992c\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.798823 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" Dec 03 06:48:36 crc kubenswrapper[4831]: I1203 06:48:36.895935 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd"] Dec 03 06:48:36 crc kubenswrapper[4831]: W1203 06:48:36.955236 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61e7e997_91d1_4a49_8243_a0032d9ce077.slice/crio-412532fdedb30bef213daec50b9001f8425c8db30a598628207657eacf89ffa3 WatchSource:0}: Error finding container 412532fdedb30bef213daec50b9001f8425c8db30a598628207657eacf89ffa3: Status 404 returned error can't find the container with id 412532fdedb30bef213daec50b9001f8425c8db30a598628207657eacf89ffa3 Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.156853 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.157067 4831 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.157146 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert podName:34580c97-5b51-43ab-affa-68c03a7c1d4d nodeName:}" failed. 
No retries permitted until 2025-12-03 06:48:39.157128474 +0000 UTC m=+1056.500711982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert") pod "infra-operator-controller-manager-57548d458d-bmk89" (UID: "34580c97-5b51-43ab-affa-68c03a7c1d4d") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.190912 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.208887 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.218598 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285"] Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.225694 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf720b38f_39f1_4b9e_a6ee_268c76a855a0.slice/crio-c11b3b348624d91e6c0ef3a59060065aad7a04b5c04607142fb0b46be078ec21 WatchSource:0}: Error finding container c11b3b348624d91e6c0ef3a59060065aad7a04b5c04607142fb0b46be078ec21: Status 404 returned error can't find the container with id c11b3b348624d91e6c0ef3a59060065aad7a04b5c04607142fb0b46be078ec21 Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.233274 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfacee23f_2039_4bc2_84e2_c209c96f0812.slice/crio-b7c0b1870cc71f3995b7fccf6c58da6a5b51521b54dad6f95727cba93e67f44f WatchSource:0}: Error finding container b7c0b1870cc71f3995b7fccf6c58da6a5b51521b54dad6f95727cba93e67f44f: Status 404 returned error can't find the container with 
id b7c0b1870cc71f3995b7fccf6c58da6a5b51521b54dad6f95727cba93e67f44f Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.236085 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6"] Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.240702 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7a1c9f6_03de_405f_b50a_31494910f498.slice/crio-12a45a254d2ccab89224d61132b535b97b745c802dc858c22e5d5013844edd83 WatchSource:0}: Error finding container 12a45a254d2ccab89224d61132b535b97b745c802dc858c22e5d5013844edd83: Status 404 returned error can't find the container with id 12a45a254d2ccab89224d61132b535b97b745c802dc858c22e5d5013844edd83 Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.258758 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.258880 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.259051 4831 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.259105 4831 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:38.259090014 +0000 UTC m=+1055.602673522 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "webhook-server-cert" not found Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.259202 4831 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.259223 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:38.259216458 +0000 UTC m=+1055.602799966 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "metrics-server-cert" not found Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.420384 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.462519 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.464115 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.464266 4831 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.464329 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert podName:1a45949e-adca-4398-82a3-a0d25c8f9702 nodeName:}" failed. No retries permitted until 2025-12-03 06:48:39.464299896 +0000 UTC m=+1056.807883404 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" (UID: "1a45949e-adca-4398-82a3-a0d25c8f9702") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.469056 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh"] Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.477779 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod17a7b8c5_b7a4_430e_b910_20d0c9a97dc1.slice/crio-8643c22f11a143d8ce4a7dd6f4eecac489869cda92f68eb41a3b0c710fee4964 WatchSource:0}: Error finding container 8643c22f11a143d8ce4a7dd6f4eecac489869cda92f68eb41a3b0c710fee4964: Status 404 returned error can't find the container with id 8643c22f11a143d8ce4a7dd6f4eecac489869cda92f68eb41a3b0c710fee4964 Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.550857 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr"] Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.551003 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaa93d4b6_18b8_43c4_a1d2_e6c95e4eebf2.slice/crio-f5787395913d0523b4e6acb98270dde86aada0f36eaa928c3be7fd68a7b09a2d WatchSource:0}: Error finding container f5787395913d0523b4e6acb98270dde86aada0f36eaa928c3be7fd68a7b09a2d: Status 404 returned error can't find the container with id f5787395913d0523b4e6acb98270dde86aada0f36eaa928c3be7fd68a7b09a2d Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.587176 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb"] Dec 03 
06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.650622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" event={"ID":"17a7b8c5-b7a4-430e-b910-20d0c9a97dc1","Type":"ContainerStarted","Data":"8643c22f11a143d8ce4a7dd6f4eecac489869cda92f68eb41a3b0c710fee4964"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.669547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" event={"ID":"facee23f-2039-4bc2-84e2-c209c96f0812","Type":"ContainerStarted","Data":"b7c0b1870cc71f3995b7fccf6c58da6a5b51521b54dad6f95727cba93e67f44f"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.698673 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.732158 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.732806 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" event={"ID":"a7a1c9f6-03de-405f-b50a-31494910f498","Type":"ContainerStarted","Data":"12a45a254d2ccab89224d61132b535b97b745c802dc858c22e5d5013844edd83"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.744780 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" event={"ID":"aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2","Type":"ContainerStarted","Data":"f5787395913d0523b4e6acb98270dde86aada0f36eaa928c3be7fd68a7b09a2d"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.746855 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" 
event={"ID":"61e7e997-91d1-4a49-8243-a0032d9ce077","Type":"ContainerStarted","Data":"412532fdedb30bef213daec50b9001f8425c8db30a598628207657eacf89ffa3"} Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.752624 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4628f220_2e57_479f_b91a_9dea443d3456.slice/crio-3700e8b88440f8df1f81eb0157ed895ab6295c298006b157c5df144c4af9299d WatchSource:0}: Error finding container 3700e8b88440f8df1f81eb0157ed895ab6295c298006b157c5df144c4af9299d: Status 404 returned error can't find the container with id 3700e8b88440f8df1f81eb0157ed895ab6295c298006b157c5df144c4af9299d Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.752916 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" event={"ID":"0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b","Type":"ContainerStarted","Data":"b569f7f9787f6b8590403e62ffb754882e2044c9479ed2ed177ce85c5f7b432a"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.755019 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.756305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" event={"ID":"f720b38f-39f1-4b9e-a6ee-268c76a855a0","Type":"ContainerStarted","Data":"c11b3b348624d91e6c0ef3a59060065aad7a04b5c04607142fb0b46be078ec21"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.762600 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.779686 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" 
event={"ID":"3c0f7e03-1610-4c34-824f-c6b7ad6310ea","Type":"ContainerStarted","Data":"4dbbc5b673edd92a20679fc807e4a38ac64b23ef852e7d9fd207d8de1a696217"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.786070 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" event={"ID":"273bb4e9-067c-47e7-8ef0-973e2890ecb0","Type":"ContainerStarted","Data":"90f882908b16251748c1c28815648dceab629e111831e1315cc0d529e4a42e77"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.793006 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" event={"ID":"6589826b-47ab-4f38-bfc6-e6d79787e272","Type":"ContainerStarted","Data":"c933dd0f9e9e2109d5f7fbb5f09bc20375a0b8ec2d2a260a73d14e978788db8a"} Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.800210 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.804671 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s"] Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.819914 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" event={"ID":"dd91cf33-b91c-4430-ae22-ff8f52171f08","Type":"ContainerStarted","Data":"5a887b6c1a619eb93a4e64a0b65101b5c3e83f93d8ca5c7ab76d61cc7669886a"} Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.822126 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zxqbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-2tgr7_openstack-operators(c1bc052b-bcf5-43a3-a84b-7ea23a95f18a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.825859 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true --v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zxqbg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-769dc69bc-2tgr7_openstack-operators(c1bc052b-bcf5-43a3-a84b-7ea23a95f18a): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.826995 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" podUID="c1bc052b-bcf5-43a3-a84b-7ea23a95f18a" Dec 03 06:48:37 crc kubenswrapper[4831]: I1203 06:48:37.830229 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" event={"ID":"f4448da7-6edc-46ba-8a6c-d5491ddfc9a2","Type":"ContainerStarted","Data":"249750a5b387130e1cb8aee106ec548548e98a4ac878c3c072940eb65e561f8d"} Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.888211 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44a494a8_2cda_4092_9510_41314e5f93c8.slice/crio-e443e1dbb43c95762f47d6021f09ced8af342372064af8092ed18c391f53b2be WatchSource:0}: Error finding container e443e1dbb43c95762f47d6021f09ced8af342372064af8092ed18c391f53b2be: Status 404 returned error can't find the container with id e443e1dbb43c95762f47d6021f09ced8af342372064af8092ed18c391f53b2be Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.889901 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ce51182_8548_461c_a6b9_1dae9a549221.slice/crio-3ec78b52b6c3d8a3c34eae850415fe700c066b8e9ede130ad3447f4202ad873f WatchSource:0}: Error finding container 3ec78b52b6c3d8a3c34eae850415fe700c066b8e9ede130ad3447f4202ad873f: Status 404 returned error can't find the container with id 3ec78b52b6c3d8a3c34eae850415fe700c066b8e9ede130ad3447f4202ad873f Dec 03 06:48:37 crc kubenswrapper[4831]: W1203 06:48:37.893981 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c0294b0_7070_44b3_adc3_63f6cae3992c.slice/crio-e362f3a0b7412c4f670fc9e98e63cf96340fe86b3f3b9b5b1b24378e238d9e29 WatchSource:0}: Error finding container e362f3a0b7412c4f670fc9e98e63cf96340fe86b3f3b9b5b1b24378e238d9e29: Status 404 returned error can't find the container with id e362f3a0b7412c4f670fc9e98e63cf96340fe86b3f3b9b5b1b24378e238d9e29 Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.899571 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qwfvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-4sh4s_openstack-operators(8ce51182-8548-461c-a6b9-1dae9a549221): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.900078 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 
500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f2sjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wmfmc_openstack-operators(0c0294b0-7070-44b3-adc3-63f6cae3992c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.901655 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" podUID="0c0294b0-7070-44b3-adc3-63f6cae3992c" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.902666 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0,Command:[],Args:[--secure-listen-address=0.0.0.0:8443 --upstream=http://127.0.0.1:8080/ --logtostderr=true 
--v=0],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{134217728 0} {} BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-qwfvw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5854674fcc-4sh4s_openstack-operators(8ce51182-8548-461c-a6b9-1dae9a549221): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Dec 03 06:48:37 crc kubenswrapper[4831]: E1203 06:48:37.903829 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ErrImagePull: \"pull QPS exceeded\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" podUID="8ce51182-8548-461c-a6b9-1dae9a549221" Dec 03 06:48:38 crc kubenswrapper[4831]: I1203 06:48:38.284151 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:38 crc kubenswrapper[4831]: I1203 06:48:38.284293 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:38 crc kubenswrapper[4831]: E1203 06:48:38.284410 4831 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:48:38 crc kubenswrapper[4831]: E1203 06:48:38.284484 4831 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:48:38 crc kubenswrapper[4831]: E1203 06:48:38.284534 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:40.284509522 +0000 UTC m=+1057.628093080 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "metrics-server-cert" not found Dec 03 06:48:38 crc kubenswrapper[4831]: E1203 06:48:38.284571 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:40.284546133 +0000 UTC m=+1057.628129711 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "webhook-server-cert" not found Dec 03 06:48:38 crc kubenswrapper[4831]: I1203 06:48:38.852506 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" event={"ID":"4628f220-2e57-479f-b91a-9dea443d3456","Type":"ContainerStarted","Data":"3700e8b88440f8df1f81eb0157ed895ab6295c298006b157c5df144c4af9299d"} Dec 03 06:48:38 crc kubenswrapper[4831]: I1203 06:48:38.861529 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" event={"ID":"c1bc052b-bcf5-43a3-a84b-7ea23a95f18a","Type":"ContainerStarted","Data":"bcdafb912277b35497a086e94057c84ace5b890eb28796d3bc991d05576e4e7a"} Dec 03 06:48:38 crc kubenswrapper[4831]: I1203 06:48:38.865894 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" 
event={"ID":"45d17cba-18de-45cc-8561-dd1d50b9061a","Type":"ContainerStarted","Data":"962a209c5ee06a72dd4fae79a899fb3cd6c3a61f2be071dc76a30089bedb6f54"} Dec 03 06:48:38 crc kubenswrapper[4831]: E1203 06:48:38.866603 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" podUID="c1bc052b-bcf5-43a3-a84b-7ea23a95f18a" Dec 03 06:48:38 crc kubenswrapper[4831]: I1203 06:48:38.869732 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" event={"ID":"0c0294b0-7070-44b3-adc3-63f6cae3992c","Type":"ContainerStarted","Data":"e362f3a0b7412c4f670fc9e98e63cf96340fe86b3f3b9b5b1b24378e238d9e29"} Dec 03 06:48:38 crc kubenswrapper[4831]: E1203 06:48:38.875595 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" podUID="0c0294b0-7070-44b3-adc3-63f6cae3992c" Dec 03 06:48:38 crc kubenswrapper[4831]: I1203 06:48:38.876587 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" event={"ID":"44a494a8-2cda-4092-9510-41314e5f93c8","Type":"ContainerStarted","Data":"e443e1dbb43c95762f47d6021f09ced8af342372064af8092ed18c391f53b2be"} Dec 03 06:48:38 crc 
kubenswrapper[4831]: I1203 06:48:38.879344 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" event={"ID":"8ce51182-8548-461c-a6b9-1dae9a549221","Type":"ContainerStarted","Data":"3ec78b52b6c3d8a3c34eae850415fe700c066b8e9ede130ad3447f4202ad873f"} Dec 03 06:48:38 crc kubenswrapper[4831]: E1203 06:48:38.887773 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" podUID="8ce51182-8548-461c-a6b9-1dae9a549221" Dec 03 06:48:39 crc kubenswrapper[4831]: I1203 06:48:39.201002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:39 crc kubenswrapper[4831]: E1203 06:48:39.201399 4831 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:39 crc kubenswrapper[4831]: E1203 06:48:39.201464 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert podName:34580c97-5b51-43ab-affa-68c03a7c1d4d nodeName:}" failed. No retries permitted until 2025-12-03 06:48:43.201448926 +0000 UTC m=+1060.545032434 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert") pod "infra-operator-controller-manager-57548d458d-bmk89" (UID: "34580c97-5b51-43ab-affa-68c03a7c1d4d") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:39 crc kubenswrapper[4831]: I1203 06:48:39.505166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:39 crc kubenswrapper[4831]: E1203 06:48:39.505380 4831 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:39 crc kubenswrapper[4831]: E1203 06:48:39.505425 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert podName:1a45949e-adca-4398-82a3-a0d25c8f9702 nodeName:}" failed. No retries permitted until 2025-12-03 06:48:43.505410967 +0000 UTC m=+1060.848994475 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" (UID: "1a45949e-adca-4398-82a3-a0d25c8f9702") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:39 crc kubenswrapper[4831]: E1203 06:48:39.944732 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" podUID="0c0294b0-7070-44b3-adc3-63f6cae3992c" Dec 03 06:48:39 crc kubenswrapper[4831]: E1203 06:48:39.945884 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:9aa8c03633e4b934c57868c1660acf47e7d386ac86bcb344df262c9ad76b8621\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" podUID="c1bc052b-bcf5-43a3-a84b-7ea23a95f18a" Dec 03 06:48:39 crc kubenswrapper[4831]: E1203 06:48:39.947999 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:101b3e007d8c9f2e183262d7712f986ad51256448099069bc14f1ea5f997ab94\\\"\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/kube-rbac-proxy:v0.16.0\\\"\"]" 
pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" podUID="8ce51182-8548-461c-a6b9-1dae9a549221" Dec 03 06:48:40 crc kubenswrapper[4831]: I1203 06:48:40.345164 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:40 crc kubenswrapper[4831]: I1203 06:48:40.345480 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:40 crc kubenswrapper[4831]: E1203 06:48:40.345602 4831 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:48:40 crc kubenswrapper[4831]: E1203 06:48:40.345643 4831 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:48:40 crc kubenswrapper[4831]: E1203 06:48:40.345649 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:44.345635639 +0000 UTC m=+1061.689219147 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "metrics-server-cert" not found Dec 03 06:48:40 crc kubenswrapper[4831]: E1203 06:48:40.345714 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:44.345696921 +0000 UTC m=+1061.689280419 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "webhook-server-cert" not found Dec 03 06:48:43 crc kubenswrapper[4831]: I1203 06:48:43.287718 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:43 crc kubenswrapper[4831]: E1203 06:48:43.287964 4831 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:43 crc kubenswrapper[4831]: E1203 06:48:43.288023 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert podName:34580c97-5b51-43ab-affa-68c03a7c1d4d nodeName:}" failed. No retries permitted until 2025-12-03 06:48:51.288004532 +0000 UTC m=+1068.631588040 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert") pod "infra-operator-controller-manager-57548d458d-bmk89" (UID: "34580c97-5b51-43ab-affa-68c03a7c1d4d") : secret "infra-operator-webhook-server-cert" not found Dec 03 06:48:43 crc kubenswrapper[4831]: I1203 06:48:43.592376 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:43 crc kubenswrapper[4831]: E1203 06:48:43.592888 4831 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:43 crc kubenswrapper[4831]: E1203 06:48:43.592937 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert podName:1a45949e-adca-4398-82a3-a0d25c8f9702 nodeName:}" failed. No retries permitted until 2025-12-03 06:48:51.592921633 +0000 UTC m=+1068.936505141 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert") pod "openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" (UID: "1a45949e-adca-4398-82a3-a0d25c8f9702") : secret "openstack-baremetal-operator-webhook-server-cert" not found Dec 03 06:48:44 crc kubenswrapper[4831]: I1203 06:48:44.412032 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:44 crc kubenswrapper[4831]: I1203 06:48:44.412217 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:44 crc kubenswrapper[4831]: E1203 06:48:44.412227 4831 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:48:44 crc kubenswrapper[4831]: E1203 06:48:44.412305 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:52.412285813 +0000 UTC m=+1069.755869321 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "metrics-server-cert" not found Dec 03 06:48:44 crc kubenswrapper[4831]: E1203 06:48:44.412377 4831 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Dec 03 06:48:44 crc kubenswrapper[4831]: E1203 06:48:44.412430 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:48:52.412415957 +0000 UTC m=+1069.755999555 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "webhook-server-cert" not found Dec 03 06:48:49 crc kubenswrapper[4831]: E1203 06:48:49.908048 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429" Dec 03 06:48:49 crc kubenswrapper[4831]: E1203 06:48:49.908566 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c4abfc148600dfa85915f3dc911d988ea2335f26cb6b8d749fe79bfe53e5e429,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2ls9k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-5f64f6f8bb-vqvxq_openstack-operators(facee23f-2039-4bc2-84e2-c209c96f0812): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:48:50 crc kubenswrapper[4831]: E1203 06:48:50.667305 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557" Dec 03 06:48:50 crc kubenswrapper[4831]: E1203 06:48:50.667905 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:0b3fb69f35c151895d3dffd514974a9f9fe1c77c3bca69b78b81efb183cf4557,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dzf2s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-5fdfd5b6b5-6wzs6_openstack-operators(273bb4e9-067c-47e7-8ef0-973e2890ecb0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:48:51 crc kubenswrapper[4831]: I1203 06:48:51.317080 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:51 crc kubenswrapper[4831]: I1203 06:48:51.325120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/34580c97-5b51-43ab-affa-68c03a7c1d4d-cert\") pod \"infra-operator-controller-manager-57548d458d-bmk89\" (UID: \"34580c97-5b51-43ab-affa-68c03a7c1d4d\") " pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:51 crc kubenswrapper[4831]: I1203 06:48:51.358337 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:48:51 crc kubenswrapper[4831]: I1203 06:48:51.622019 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:51 crc kubenswrapper[4831]: I1203 06:48:51.628355 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1a45949e-adca-4398-82a3-a0d25c8f9702-cert\") pod \"openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn\" (UID: \"1a45949e-adca-4398-82a3-a0d25c8f9702\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:51 crc kubenswrapper[4831]: I1203 06:48:51.637444 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:48:52 crc kubenswrapper[4831]: I1203 06:48:52.433876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:52 crc kubenswrapper[4831]: E1203 06:48:52.434055 4831 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Dec 03 06:48:52 crc kubenswrapper[4831]: I1203 06:48:52.434391 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:52 crc kubenswrapper[4831]: E1203 06:48:52.434398 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs podName:8a2d00a9-7a0c-45d2-8f1d-080748f8366b nodeName:}" failed. No retries permitted until 2025-12-03 06:49:08.434378022 +0000 UTC m=+1085.777961540 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs") pod "openstack-operator-controller-manager-5586f6bb8b-ps5jg" (UID: "8a2d00a9-7a0c-45d2-8f1d-080748f8366b") : secret "metrics-server-cert" not found Dec 03 06:48:52 crc kubenswrapper[4831]: I1203 06:48:52.439873 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-webhook-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:48:52 crc kubenswrapper[4831]: E1203 06:48:52.625213 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168" Dec 03 06:48:52 crc kubenswrapper[4831]: E1203 06:48:52.625506 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:d9a3694865a7d54ee96397add18c3898886e98d079aa20876a0f4de1fa7a7168,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jrcgx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-998648c74-cdbrr_openstack-operators(aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:48:53 crc kubenswrapper[4831]: E1203 06:48:53.215091 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f" Dec 03 06:48:53 crc kubenswrapper[4831]: E1203 06:48:53.215292 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:d29650b006da97eb9178fcc58f2eb9fead8c2b414fac18f86a3c3a1507488c4f,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8qctc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-78f8948974-vg5hb_openstack-operators(3c0f7e03-1610-4c34-824f-c6b7ad6310ea): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:48:53 crc kubenswrapper[4831]: E1203 06:48:53.969761 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801" Dec 03 06:48:53 crc kubenswrapper[4831]: E1203 06:48:53.970152 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:1d60701214b39cdb0fa70bbe5710f9b131139a9f4b482c2db4058a04daefb801,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wczqs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-859b6ccc6-mcp6w_openstack-operators(379fb9f5-e9c6-4362-b40a-c80ac7f58562): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:48:54 crc kubenswrapper[4831]: E1203 06:48:54.549876 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670" Dec 03 06:48:54 crc kubenswrapper[4831]: E1203 06:48:54.550079 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:779f0cee6024d0fb8f259b036fe790e62aa5a3b0431ea9bf15a6e7d02e2e5670,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-n8j7s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-697bc559fc-4fxmh_openstack-operators(17a7b8c5-b7a4-430e-b910-20d0c9a97dc1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:48:55 crc kubenswrapper[4831]: E1203 06:48:55.046561 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7" Dec 03 06:48:55 crc kubenswrapper[4831]: E1203 06:48:55.046740 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:72ad6517987f674af0d0ae092cbb874aeae909c8b8b60188099c311762ebc8f7,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jp2qp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-7765d96ddf-xl7b2_openstack-operators(6589826b-47ab-4f38-bfc6-e6d79787e272): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:49:04 crc kubenswrapper[4831]: I1203 06:49:04.188027 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn"] Dec 03 06:49:04 crc kubenswrapper[4831]: E1203 06:49:04.268774 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Dec 03 06:49:04 crc kubenswrapper[4831]: E1203 06:49:04.269006 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-f2sjj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-wmfmc_openstack-operators(0c0294b0-7070-44b3-adc3-63f6cae3992c): 
ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:49:04 crc kubenswrapper[4831]: E1203 06:49:04.270242 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" podUID="0c0294b0-7070-44b3-adc3-63f6cae3992c" Dec 03 06:49:04 crc kubenswrapper[4831]: W1203 06:49:04.340177 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a45949e_adca_4398_82a3_a0d25c8f9702.slice/crio-44aa98712332c75e40be9eef8938b0014e7b12c1c222bc80b9c24086be8249fe WatchSource:0}: Error finding container 44aa98712332c75e40be9eef8938b0014e7b12c1c222bc80b9c24086be8249fe: Status 404 returned error can't find the container with id 44aa98712332c75e40be9eef8938b0014e7b12c1c222bc80b9c24086be8249fe Dec 03 06:49:04 crc kubenswrapper[4831]: I1203 06:49:04.343756 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 06:49:04 crc kubenswrapper[4831]: I1203 06:49:04.819615 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-57548d458d-bmk89"] Dec 03 06:49:04 crc kubenswrapper[4831]: W1203 06:49:04.972142 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod34580c97_5b51_43ab_affa_68c03a7c1d4d.slice/crio-b4d51f27857a24714c593cdb48c629696f1f2d82fe1ec05f526f4eac4392fe2a WatchSource:0}: Error finding container b4d51f27857a24714c593cdb48c629696f1f2d82fe1ec05f526f4eac4392fe2a: Status 404 returned error can't find the container with id b4d51f27857a24714c593cdb48c629696f1f2d82fe1ec05f526f4eac4392fe2a Dec 03 06:49:05 crc kubenswrapper[4831]: I1203 06:49:05.215633 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" event={"ID":"1a45949e-adca-4398-82a3-a0d25c8f9702","Type":"ContainerStarted","Data":"44aa98712332c75e40be9eef8938b0014e7b12c1c222bc80b9c24086be8249fe"} Dec 03 06:49:05 crc kubenswrapper[4831]: I1203 06:49:05.217203 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" event={"ID":"34580c97-5b51-43ab-affa-68c03a7c1d4d","Type":"ContainerStarted","Data":"b4d51f27857a24714c593cdb48c629696f1f2d82fe1ec05f526f4eac4392fe2a"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.227065 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" event={"ID":"0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b","Type":"ContainerStarted","Data":"c56a3c78ec57312974542f4ffdb1a9997b3145210e71dade2ed9f411b4c3d94d"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.228175 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" event={"ID":"44a494a8-2cda-4092-9510-41314e5f93c8","Type":"ContainerStarted","Data":"e1307d613328a3db9fa732881c57825b2a9ccc9ae951c29b0d4da9b7b0949c38"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.229015 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" event={"ID":"4628f220-2e57-479f-b91a-9dea443d3456","Type":"ContainerStarted","Data":"fb70871325f2693a606c39d213355f0615051d9ab52c5e2160c0e3a5268334c4"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.229894 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" 
event={"ID":"61e7e997-91d1-4a49-8243-a0032d9ce077","Type":"ContainerStarted","Data":"c84335bc87cfdb69633e7683b8f47c791a2b6e1c93e239b9f99fd95ebdba1e15"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.248594 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" event={"ID":"f720b38f-39f1-4b9e-a6ee-268c76a855a0","Type":"ContainerStarted","Data":"85358df2dcbbbfbb2c4667135737def243dc5a50ebe7875872a16822fb4e8bfa"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.251296 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" event={"ID":"dd91cf33-b91c-4430-ae22-ff8f52171f08","Type":"ContainerStarted","Data":"1cb6ce4d3fa7870f86b954c918f0b5b584c4a071fbb40ad9d5ee211e3d4a54c9"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.252194 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" event={"ID":"f4448da7-6edc-46ba-8a6c-d5491ddfc9a2","Type":"ContainerStarted","Data":"d1be07e5f68f5041a32b2de6bee1ee8cfca8ac3f4ed66ac92393582262a6f1b6"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.252899 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" event={"ID":"45d17cba-18de-45cc-8561-dd1d50b9061a","Type":"ContainerStarted","Data":"080456bd46a01beebd3bb7fc37b231a0b364f42077976a5dd148cccbe9b0c877"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.253780 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" event={"ID":"a7a1c9f6-03de-405f-b50a-31494910f498","Type":"ContainerStarted","Data":"2554b6ff60b682c2ac4709e8d584761314b5ed61f52cbc73e81ba07df944ab4d"} Dec 03 06:49:06 crc kubenswrapper[4831]: I1203 06:49:06.254530 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" event={"ID":"13a6a910-c42b-4ba4-85ad-62d932c41b4d","Type":"ContainerStarted","Data":"0eaa01ff459f5b7a3b38fb6e7b3d207c8434a476cd7a454f6d3bbb68b095631c"} Dec 03 06:49:08 crc kubenswrapper[4831]: I1203 06:49:08.514284 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:49:08 crc kubenswrapper[4831]: I1203 06:49:08.557147 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8a2d00a9-7a0c-45d2-8f1d-080748f8366b-metrics-certs\") pod \"openstack-operator-controller-manager-5586f6bb8b-ps5jg\" (UID: \"8a2d00a9-7a0c-45d2-8f1d-080748f8366b\") " pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:49:08 crc kubenswrapper[4831]: I1203 06:49:08.591485 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-5wpqz" Dec 03 06:49:08 crc kubenswrapper[4831]: I1203 06:49:08.599692 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:49:09 crc kubenswrapper[4831]: I1203 06:49:09.289832 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" event={"ID":"8ce51182-8548-461c-a6b9-1dae9a549221","Type":"ContainerStarted","Data":"116bac1d8ece54451102db66204b33f375e5d2d23be1524d0ec0a7b38b8508bd"} Dec 03 06:49:10 crc kubenswrapper[4831]: I1203 06:49:10.099391 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg"] Dec 03 06:49:10 crc kubenswrapper[4831]: W1203 06:49:10.112687 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a2d00a9_7a0c_45d2_8f1d_080748f8366b.slice/crio-87120995a47d9d5fee01e4aac5633eb5150aa28779ca9f0953282509064d62a1 WatchSource:0}: Error finding container 87120995a47d9d5fee01e4aac5633eb5150aa28779ca9f0953282509064d62a1: Status 404 returned error can't find the container with id 87120995a47d9d5fee01e4aac5633eb5150aa28779ca9f0953282509064d62a1 Dec 03 06:49:10 crc kubenswrapper[4831]: I1203 06:49:10.298129 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" event={"ID":"8a2d00a9-7a0c-45d2-8f1d-080748f8366b","Type":"ContainerStarted","Data":"87120995a47d9d5fee01e4aac5633eb5150aa28779ca9f0953282509064d62a1"} Dec 03 06:49:10 crc kubenswrapper[4831]: I1203 06:49:10.300775 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" event={"ID":"c1bc052b-bcf5-43a3-a84b-7ea23a95f18a","Type":"ContainerStarted","Data":"4f3f8a98c3fc18c5601b6a4e07f0be8beacfe21093ba279dc722356313f7e5c8"} Dec 03 06:49:10 crc kubenswrapper[4831]: E1203 06:49:10.429246 4831 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" podUID="3c0f7e03-1610-4c34-824f-c6b7ad6310ea" Dec 03 06:49:10 crc kubenswrapper[4831]: E1203 06:49:10.449660 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" podUID="273bb4e9-067c-47e7-8ef0-973e2890ecb0" Dec 03 06:49:10 crc kubenswrapper[4831]: E1203 06:49:10.464130 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" podUID="aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2" Dec 03 06:49:10 crc kubenswrapper[4831]: E1203 06:49:10.643353 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" podUID="facee23f-2039-4bc2-84e2-c209c96f0812" Dec 03 06:49:10 crc kubenswrapper[4831]: E1203 06:49:10.675197 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" podUID="379fb9f5-e9c6-4362-b40a-c80ac7f58562" Dec 03 06:49:10 crc kubenswrapper[4831]: E1203 06:49:10.898158 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: 
\"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" podUID="17a7b8c5-b7a4-430e-b910-20d0c9a97dc1" Dec 03 06:49:11 crc kubenswrapper[4831]: E1203 06:49:11.125806 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" podUID="6589826b-47ab-4f38-bfc6-e6d79787e272" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.319700 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" event={"ID":"a7a1c9f6-03de-405f-b50a-31494910f498","Type":"ContainerStarted","Data":"02e0a0a471feb755b3afcacfb9c226b367e3ccf7d108c2e90eecf6b3d64822e8"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.320026 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.322401 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.326598 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" event={"ID":"6589826b-47ab-4f38-bfc6-e6d79787e272","Type":"ContainerStarted","Data":"29b0b5907275e645c7382d0d1b147d12f8e5ab8961e7533a2df295af53ed4aca"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.332969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" 
event={"ID":"aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2","Type":"ContainerStarted","Data":"507d3eb9c0b2ec8611254dad850c458663eefd82bf2cfe5071c9b97585701a7b"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.338743 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" event={"ID":"3c0f7e03-1610-4c34-824f-c6b7ad6310ea","Type":"ContainerStarted","Data":"237ae1c381854c3e080d6a14a3b8c534740ffbd73591018ee2e52cd3eedb0191"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.346811 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" event={"ID":"17a7b8c5-b7a4-430e-b910-20d0c9a97dc1","Type":"ContainerStarted","Data":"570b477cb7dc2bef659e4a76b8689b031263c732bf1409b8ecd0ef3978e72b9c"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.353264 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-7c79b5df47-k9882" podStartSLOduration=3.502211245 podStartE2EDuration="36.353237872s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.245654814 +0000 UTC m=+1054.589238322" lastFinishedPulling="2025-12-03 06:49:10.096681441 +0000 UTC m=+1087.440264949" observedRunningTime="2025-12-03 06:49:11.349884657 +0000 UTC m=+1088.693468165" watchObservedRunningTime="2025-12-03 06:49:11.353237872 +0000 UTC m=+1088.696821380" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.353509 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" event={"ID":"f720b38f-39f1-4b9e-a6ee-268c76a855a0","Type":"ContainerStarted","Data":"9fb5144b6654d7db2fb52846dae5217eb136d4146b9157478064a5fc91d8a3eb"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.353685 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.356083 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" event={"ID":"45d17cba-18de-45cc-8561-dd1d50b9061a","Type":"ContainerStarted","Data":"1495c6b0d048328daa844153001e6e6f05079dd5439029d903d5bac4612951da"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.356545 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.356832 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.361367 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.366042 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" event={"ID":"273bb4e9-067c-47e7-8ef0-973e2890ecb0","Type":"ContainerStarted","Data":"b51dd240fa7042b5da4827774edab22f9cad4c9a547b4e758378e66eda3d0726"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.369694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" event={"ID":"1a45949e-adca-4398-82a3-a0d25c8f9702","Type":"ContainerStarted","Data":"bfb0abb0f0b04b8e64a129548df5cdb7940b4852077b1edb1ddaf79cb8f2b680"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.374739 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" 
event={"ID":"8ce51182-8548-461c-a6b9-1dae9a549221","Type":"ContainerStarted","Data":"937ab0eed928f2f9b21413df070b111d2420c119da534c77efbee26ba231e31f"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.375515 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.380256 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" event={"ID":"4628f220-2e57-479f-b91a-9dea443d3456","Type":"ContainerStarted","Data":"8156c3a35026165495f93b4f966f50884d204ce3e64fee7a8c74889255d83bbc"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.380666 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.382371 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.390553 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" event={"ID":"c1bc052b-bcf5-43a3-a84b-7ea23a95f18a","Type":"ContainerStarted","Data":"52e41bb73e049e1d93ac9f4deed605e0fd01799b5ffed2b536f984060769960e"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.391239 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.405558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" 
event={"ID":"34580c97-5b51-43ab-affa-68c03a7c1d4d","Type":"ContainerStarted","Data":"0083c845cfb7bde107e3a59d1412bf4a66b771fd653f734af8f979aaf8d7e9e5"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.408038 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" event={"ID":"facee23f-2039-4bc2-84e2-c209c96f0812","Type":"ContainerStarted","Data":"99994e363ef152e361099e585d33eaf66d787f3ded9cec4b10a3d3699501abad"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.415861 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" event={"ID":"8a2d00a9-7a0c-45d2-8f1d-080748f8366b","Type":"ContainerStarted","Data":"392485f9b8a5c0df4d7440e9fa8942dc62c493c7bdb3615d4a3312cd93ac4751"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.416495 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.426540 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" event={"ID":"379fb9f5-e9c6-4362-b40a-c80ac7f58562","Type":"ContainerStarted","Data":"eb6db54291a5cec4b08e50f53983acd337afe014414974969564240fdcb4742f"} Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.505511 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" podStartSLOduration=10.068381544 podStartE2EDuration="36.505493616s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.899483514 +0000 UTC m=+1055.243067012" lastFinishedPulling="2025-12-03 06:49:04.336595536 +0000 UTC m=+1081.680179084" observedRunningTime="2025-12-03 06:49:11.504509305 +0000 UTC m=+1088.848092813" 
watchObservedRunningTime="2025-12-03 06:49:11.505493616 +0000 UTC m=+1088.849077114" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.528859 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-5f8c65bbfc-v9ptz" podStartSLOduration=4.198763691 podStartE2EDuration="36.528844036s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.788185432 +0000 UTC m=+1055.131768940" lastFinishedPulling="2025-12-03 06:49:10.118265777 +0000 UTC m=+1087.461849285" observedRunningTime="2025-12-03 06:49:11.527738892 +0000 UTC m=+1088.871322390" watchObservedRunningTime="2025-12-03 06:49:11.528844036 +0000 UTC m=+1088.872427534" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.564276 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" podStartSLOduration=9.662055579 podStartE2EDuration="36.564258265s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.821969018 +0000 UTC m=+1055.165552526" lastFinishedPulling="2025-12-03 06:49:04.724171704 +0000 UTC m=+1082.067755212" observedRunningTime="2025-12-03 06:49:11.563931265 +0000 UTC m=+1088.907514773" watchObservedRunningTime="2025-12-03 06:49:11.564258265 +0000 UTC m=+1088.907841773" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.630857 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" podStartSLOduration=35.630839668 podStartE2EDuration="35.630839668s" podCreationTimestamp="2025-12-03 06:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:49:11.625031137 +0000 UTC m=+1088.968614645" watchObservedRunningTime="2025-12-03 06:49:11.630839668 +0000 UTC 
m=+1088.974423176" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.701777 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-56bbcc9d85-dg285" podStartSLOduration=3.838291562 podStartE2EDuration="36.701759448s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.233185215 +0000 UTC m=+1054.576768723" lastFinishedPulling="2025-12-03 06:49:10.096653111 +0000 UTC m=+1087.440236609" observedRunningTime="2025-12-03 06:49:11.697102552 +0000 UTC m=+1089.040686060" watchObservedRunningTime="2025-12-03 06:49:11.701759448 +0000 UTC m=+1089.045342956" Dec 03 06:49:11 crc kubenswrapper[4831]: I1203 06:49:11.726234 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-b6456fdb6-kx5qm" podStartSLOduration=4.37542627 podStartE2EDuration="36.726213093s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.767565836 +0000 UTC m=+1055.111149344" lastFinishedPulling="2025-12-03 06:49:10.118352619 +0000 UTC m=+1087.461936167" observedRunningTime="2025-12-03 06:49:11.725733818 +0000 UTC m=+1089.069317336" watchObservedRunningTime="2025-12-03 06:49:11.726213093 +0000 UTC m=+1089.069796591" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.438085 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" event={"ID":"1a45949e-adca-4398-82a3-a0d25c8f9702","Type":"ContainerStarted","Data":"b15a322ebe6b4b1e1d6444ef550f3f601e596e4ffcf42819cea51ce834739dcd"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.438252 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.446123 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" event={"ID":"0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b","Type":"ContainerStarted","Data":"b71876ecbb94472180bb5e97682434efb82b141bc96e68af71d321b43becaddb"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.446427 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.448184 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.450673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" event={"ID":"44a494a8-2cda-4092-9510-41314e5f93c8","Type":"ContainerStarted","Data":"c49a6d7b31f51aa614913738065d5696cdf08544a6b5fb0c7eab0322ab574199"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.451790 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.456605 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" event={"ID":"34580c97-5b51-43ab-affa-68c03a7c1d4d","Type":"ContainerStarted","Data":"cadc3fbc2e0f74b746b3ff15a0e8512637d42618c4bacd4af36c3af606176823"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.457447 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.463064 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.466660 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" event={"ID":"273bb4e9-067c-47e7-8ef0-973e2890ecb0","Type":"ContainerStarted","Data":"6180fed9c644ccdd1ed73795aef3bfe5fa66381b52bae5c1de1986b24348ca9a"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.467419 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.474777 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" podStartSLOduration=32.050491111 podStartE2EDuration="37.474760597s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:49:04.343476782 +0000 UTC m=+1081.687060290" lastFinishedPulling="2025-12-03 06:49:09.767746268 +0000 UTC m=+1087.111329776" observedRunningTime="2025-12-03 06:49:12.473153336 +0000 UTC m=+1089.816736854" watchObservedRunningTime="2025-12-03 06:49:12.474760597 +0000 UTC m=+1089.818344105" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.491260 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" event={"ID":"13a6a910-c42b-4ba4-85ad-62d932c41b4d","Type":"ContainerStarted","Data":"88f810f68ab9b12fa8142a193bc7d873da94805707bf928f4258eb080e6f3caa"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.492165 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.494072 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.497994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" event={"ID":"61e7e997-91d1-4a49-8243-a0032d9ce077","Type":"ContainerStarted","Data":"d53afe7c3e7ceda3be6d79e877bca5d93217a7836349496cbadc63bc778d713e"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.499182 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.500792 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.501857 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" event={"ID":"dd91cf33-b91c-4430-ae22-ff8f52171f08","Type":"ContainerStarted","Data":"b9c192b2f86f7b8b7013f4d74761835870f9eff79020d15e2c6e6a47bc6f0ea9"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.503665 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.504361 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.506429 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" event={"ID":"facee23f-2039-4bc2-84e2-c209c96f0812","Type":"ContainerStarted","Data":"f3feddfa88afe4d8cfcb7d31bb4bb2434c48944b0677e202c6bcb34f470884da"} 
Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.507437 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.513652 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" event={"ID":"f4448da7-6edc-46ba-8a6c-d5491ddfc9a2","Type":"ContainerStarted","Data":"b8b77e08f941b42d75e068e83a28be4cfb097912c1fefad70ee757b5a21aa376"} Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.514203 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.516362 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.516458 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" podStartSLOduration=32.526563248 podStartE2EDuration="37.516441231s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:49:04.97968499 +0000 UTC m=+1082.323268498" lastFinishedPulling="2025-12-03 06:49:09.969562973 +0000 UTC m=+1087.313146481" observedRunningTime="2025-12-03 06:49:12.513722896 +0000 UTC m=+1089.857306424" watchObservedRunningTime="2025-12-03 06:49:12.516441231 +0000 UTC m=+1089.860024729" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.523481 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" event={"ID":"aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2","Type":"ContainerStarted","Data":"cde832e4b3a34ad17184e2021cfd15bc1ed4b892393cdcfbd0f129e38d803c80"} Dec 03 
06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.536201 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-76cc84c6bb-txlnp" podStartSLOduration=5.312293307 podStartE2EDuration="37.536184169s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.895252722 +0000 UTC m=+1055.238836230" lastFinishedPulling="2025-12-03 06:49:10.119143584 +0000 UTC m=+1087.462727092" observedRunningTime="2025-12-03 06:49:12.535436415 +0000 UTC m=+1089.879019973" watchObservedRunningTime="2025-12-03 06:49:12.536184169 +0000 UTC m=+1089.879767677" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.566806 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6c548fd776-l25t8" podStartSLOduration=4.915079036 podStartE2EDuration="37.566777816s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.46504812 +0000 UTC m=+1054.808631628" lastFinishedPulling="2025-12-03 06:49:10.1167469 +0000 UTC m=+1087.460330408" observedRunningTime="2025-12-03 06:49:12.559270211 +0000 UTC m=+1089.902853719" watchObservedRunningTime="2025-12-03 06:49:12.566777816 +0000 UTC m=+1089.910361324" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.585397 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" podStartSLOduration=2.8395398480000003 podStartE2EDuration="37.585380038s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.244064775 +0000 UTC m=+1054.587648283" lastFinishedPulling="2025-12-03 06:49:11.989904975 +0000 UTC m=+1089.333488473" observedRunningTime="2025-12-03 06:49:12.582979833 +0000 UTC m=+1089.926563341" watchObservedRunningTime="2025-12-03 06:49:12.585380038 +0000 UTC m=+1089.928963546" Dec 03 06:49:12 
crc kubenswrapper[4831]: I1203 06:49:12.606455 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" podStartSLOduration=3.168324168 podStartE2EDuration="37.606438318s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.589623068 +0000 UTC m=+1054.933206576" lastFinishedPulling="2025-12-03 06:49:12.027737218 +0000 UTC m=+1089.371320726" observedRunningTime="2025-12-03 06:49:12.606093317 +0000 UTC m=+1089.949676825" watchObservedRunningTime="2025-12-03 06:49:12.606438318 +0000 UTC m=+1089.950021826" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.645167 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-78b4bc895b-5lshd" podStartSLOduration=4.491642407 podStartE2EDuration="37.645151159s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:36.963104713 +0000 UTC m=+1054.306688221" lastFinishedPulling="2025-12-03 06:49:10.116613465 +0000 UTC m=+1087.460196973" observedRunningTime="2025-12-03 06:49:12.640595586 +0000 UTC m=+1089.984179094" watchObservedRunningTime="2025-12-03 06:49:12.645151159 +0000 UTC m=+1089.988734667" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.715608 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-77987cd8cd-xgw7z" podStartSLOduration=4.111828022 podStartE2EDuration="37.715593153s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:36.511377338 +0000 UTC m=+1053.854960846" lastFinishedPulling="2025-12-03 06:49:10.115142459 +0000 UTC m=+1087.458725977" observedRunningTime="2025-12-03 06:49:12.714877041 +0000 UTC m=+1090.058460549" watchObservedRunningTime="2025-12-03 06:49:12.715593153 +0000 UTC m=+1090.059176661" Dec 03 06:49:12 crc 
kubenswrapper[4831]: I1203 06:49:12.833351 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-68c6d99b8f-cvxcd" podStartSLOduration=4.41473564 podStartE2EDuration="37.833336547s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:36.695770088 +0000 UTC m=+1054.039353596" lastFinishedPulling="2025-12-03 06:49:10.114370995 +0000 UTC m=+1087.457954503" observedRunningTime="2025-12-03 06:49:12.832261634 +0000 UTC m=+1090.175845142" watchObservedRunningTime="2025-12-03 06:49:12.833336547 +0000 UTC m=+1090.176920055" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.916447 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" podStartSLOduration=3.125670563 podStartE2EDuration="37.916431908s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.244254861 +0000 UTC m=+1054.587838369" lastFinishedPulling="2025-12-03 06:49:12.035016206 +0000 UTC m=+1089.378599714" observedRunningTime="2025-12-03 06:49:12.915586911 +0000 UTC m=+1090.259170419" watchObservedRunningTime="2025-12-03 06:49:12.916431908 +0000 UTC m=+1090.260015406" Dec 03 06:49:12 crc kubenswrapper[4831]: I1203 06:49:12.919603 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7d9dfd778-9tvxr" podStartSLOduration=4.474261404 podStartE2EDuration="37.919593057s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:36.721335548 +0000 UTC m=+1054.064919056" lastFinishedPulling="2025-12-03 06:49:10.166667201 +0000 UTC m=+1087.510250709" observedRunningTime="2025-12-03 06:49:12.883074514 +0000 UTC m=+1090.226658022" watchObservedRunningTime="2025-12-03 06:49:12.919593057 +0000 UTC m=+1090.263176565" Dec 03 06:49:13 crc kubenswrapper[4831]: 
I1203 06:49:13.534712 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" event={"ID":"379fb9f5-e9c6-4362-b40a-c80ac7f58562","Type":"ContainerStarted","Data":"a619704d917b0582996e5bde3c6a8501857e51fbe6166c7cf17b83931a49dfec"} Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.536490 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.539819 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" event={"ID":"3c0f7e03-1610-4c34-824f-c6b7ad6310ea","Type":"ContainerStarted","Data":"c5c7d5412dcc009e882195f5849639037cd5aece46d842173d76be637d94c7cf"} Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.540132 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.544416 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" event={"ID":"6589826b-47ab-4f38-bfc6-e6d79787e272","Type":"ContainerStarted","Data":"050d60bc6e29ac91777c0cf1a39cba2f037b55d7510933e5b0b73daf6e4582b6"} Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.545028 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.548793 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" event={"ID":"17a7b8c5-b7a4-430e-b910-20d0c9a97dc1","Type":"ContainerStarted","Data":"aab15b45889c00000b1096f1bff7650947cb939751fed796dcd3aad5046533bc"} Dec 03 
06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.548923 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.552037 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.566685 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" podStartSLOduration=2.967125111 podStartE2EDuration="38.566669325s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:36.436530275 +0000 UTC m=+1053.780113773" lastFinishedPulling="2025-12-03 06:49:12.036074479 +0000 UTC m=+1089.379657987" observedRunningTime="2025-12-03 06:49:13.565451767 +0000 UTC m=+1090.909035335" watchObservedRunningTime="2025-12-03 06:49:13.566669325 +0000 UTC m=+1090.910252833" Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.592622 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" podStartSLOduration=4.041370336 podStartE2EDuration="38.592593506s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.481940438 +0000 UTC m=+1054.825523956" lastFinishedPulling="2025-12-03 06:49:12.033163618 +0000 UTC m=+1089.376747126" observedRunningTime="2025-12-03 06:49:13.586840846 +0000 UTC m=+1090.930424394" watchObservedRunningTime="2025-12-03 06:49:13.592593506 +0000 UTC m=+1090.936177044" Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.607988 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" podStartSLOduration=4.18045082 
podStartE2EDuration="38.607973678s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.60820318 +0000 UTC m=+1054.951786688" lastFinishedPulling="2025-12-03 06:49:12.035726038 +0000 UTC m=+1089.379309546" observedRunningTime="2025-12-03 06:49:13.606632925 +0000 UTC m=+1090.950216473" watchObservedRunningTime="2025-12-03 06:49:13.607973678 +0000 UTC m=+1090.951557186" Dec 03 06:49:13 crc kubenswrapper[4831]: I1203 06:49:13.639465 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" podStartSLOduration=4.075947938 podStartE2EDuration="38.639435262s" podCreationTimestamp="2025-12-03 06:48:35 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.47112678 +0000 UTC m=+1054.814710288" lastFinishedPulling="2025-12-03 06:49:12.034614104 +0000 UTC m=+1089.378197612" observedRunningTime="2025-12-03 06:49:13.632265947 +0000 UTC m=+1090.975849465" watchObservedRunningTime="2025-12-03 06:49:13.639435262 +0000 UTC m=+1090.983018810" Dec 03 06:49:16 crc kubenswrapper[4831]: I1203 06:49:16.529875 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5854674fcc-4sh4s" Dec 03 06:49:16 crc kubenswrapper[4831]: I1203 06:49:16.649622 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-769dc69bc-2tgr7" Dec 03 06:49:18 crc kubenswrapper[4831]: E1203 06:49:18.015797 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" podUID="0c0294b0-7070-44b3-adc3-63f6cae3992c" Dec 03 06:49:18 
crc kubenswrapper[4831]: I1203 06:49:18.613439 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-5586f6bb8b-ps5jg" Dec 03 06:49:21 crc kubenswrapper[4831]: I1203 06:49:21.370693 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-57548d458d-bmk89" Dec 03 06:49:21 crc kubenswrapper[4831]: I1203 06:49:21.648491 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn" Dec 03 06:49:25 crc kubenswrapper[4831]: I1203 06:49:25.603244 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-859b6ccc6-mcp6w" Dec 03 06:49:25 crc kubenswrapper[4831]: I1203 06:49:25.683240 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-5f64f6f8bb-vqvxq" Dec 03 06:49:25 crc kubenswrapper[4831]: I1203 06:49:25.787935 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7765d96ddf-xl7b2" Dec 03 06:49:25 crc kubenswrapper[4831]: I1203 06:49:25.942195 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-5fdfd5b6b5-6wzs6" Dec 03 06:49:25 crc kubenswrapper[4831]: I1203 06:49:25.969891 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-697bc559fc-4fxmh" Dec 03 06:49:25 crc kubenswrapper[4831]: I1203 06:49:25.988834 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-998648c74-cdbrr" Dec 03 06:49:26 crc kubenswrapper[4831]: I1203 06:49:26.205999 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-78f8948974-vg5hb" Dec 03 06:49:27 crc kubenswrapper[4831]: I1203 06:49:27.597302 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:49:27 crc kubenswrapper[4831]: I1203 06:49:27.597537 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:49:33 crc kubenswrapper[4831]: I1203 06:49:33.743600 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" event={"ID":"0c0294b0-7070-44b3-adc3-63f6cae3992c","Type":"ContainerStarted","Data":"8c1c5734c67fbedbb067f0e1c6114824b17bdbc41a6428e7b6b49b75863c1f4c"} Dec 03 06:49:33 crc kubenswrapper[4831]: I1203 06:49:33.771245 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-wmfmc" podStartSLOduration=2.348603079 podStartE2EDuration="57.771214976s" podCreationTimestamp="2025-12-03 06:48:36 +0000 UTC" firstStartedPulling="2025-12-03 06:48:37.900014951 +0000 UTC m=+1055.243598459" lastFinishedPulling="2025-12-03 06:49:33.322626818 +0000 UTC m=+1110.666210356" observedRunningTime="2025-12-03 06:49:33.760965464 +0000 UTC m=+1111.104549022" watchObservedRunningTime="2025-12-03 06:49:33.771214976 +0000 UTC m=+1111.114798524" Dec 03 06:49:48 crc kubenswrapper[4831]: I1203 06:49:48.945019 4831 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c8vfv"] Dec 03 06:49:48 crc kubenswrapper[4831]: I1203 06:49:48.947178 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:49:48 crc kubenswrapper[4831]: I1203 06:49:48.950382 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-zj8l2" Dec 03 06:49:48 crc kubenswrapper[4831]: I1203 06:49:48.950558 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 06:49:48 crc kubenswrapper[4831]: I1203 06:49:48.958103 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 06:49:48 crc kubenswrapper[4831]: I1203 06:49:48.958638 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c8vfv"] Dec 03 06:49:48 crc kubenswrapper[4831]: I1203 06:49:48.959297 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.035604 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zllbc"] Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.036739 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.038567 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.050731 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zllbc"] Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.063064 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76xss\" (UniqueName: \"kubernetes.io/projected/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-kube-api-access-76xss\") pod \"dnsmasq-dns-675f4bcbfc-c8vfv\" (UID: \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.063120 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-config\") pod \"dnsmasq-dns-675f4bcbfc-c8vfv\" (UID: \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.164514 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-config\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.164639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76xss\" (UniqueName: \"kubernetes.io/projected/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-kube-api-access-76xss\") pod \"dnsmasq-dns-675f4bcbfc-c8vfv\" (UID: \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 
03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.164667 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-config\") pod \"dnsmasq-dns-675f4bcbfc-c8vfv\" (UID: \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.164701 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.164729 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsgg2\" (UniqueName: \"kubernetes.io/projected/469328a1-75a9-4909-a0cf-1e8f41656089-kube-api-access-gsgg2\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.165998 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-config\") pod \"dnsmasq-dns-675f4bcbfc-c8vfv\" (UID: \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.187064 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76xss\" (UniqueName: \"kubernetes.io/projected/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-kube-api-access-76xss\") pod \"dnsmasq-dns-675f4bcbfc-c8vfv\" (UID: \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\") " pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 
06:49:49.266253 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.266342 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsgg2\" (UniqueName: \"kubernetes.io/projected/469328a1-75a9-4909-a0cf-1e8f41656089-kube-api-access-gsgg2\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.266382 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-config\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.267501 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.267521 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-config\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.270127 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.281899 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsgg2\" (UniqueName: \"kubernetes.io/projected/469328a1-75a9-4909-a0cf-1e8f41656089-kube-api-access-gsgg2\") pod \"dnsmasq-dns-78dd6ddcc-zllbc\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.359017 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:49:49 crc kubenswrapper[4831]: W1203 06:49:49.743058 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd6c7d82c_a79b_43d2_b08a_cffd5d7c5e2c.slice/crio-904a0103cd3371fb01c2934ed0b3c2deab670dc5f4c0336f7995f3679f07da19 WatchSource:0}: Error finding container 904a0103cd3371fb01c2934ed0b3c2deab670dc5f4c0336f7995f3679f07da19: Status 404 returned error can't find the container with id 904a0103cd3371fb01c2934ed0b3c2deab670dc5f4c0336f7995f3679f07da19 Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.743585 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c8vfv"] Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.844851 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zllbc"] Dec 03 06:49:49 crc kubenswrapper[4831]: W1203 06:49:49.852588 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod469328a1_75a9_4909_a0cf_1e8f41656089.slice/crio-47109f9247c0173f4aadef663f251c901feb74d59bc629c2928f8f09e864ddbf WatchSource:0}: Error finding container 47109f9247c0173f4aadef663f251c901feb74d59bc629c2928f8f09e864ddbf: Status 404 returned error can't find the container with id 
47109f9247c0173f4aadef663f251c901feb74d59bc629c2928f8f09e864ddbf Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.886938 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" event={"ID":"469328a1-75a9-4909-a0cf-1e8f41656089","Type":"ContainerStarted","Data":"47109f9247c0173f4aadef663f251c901feb74d59bc629c2928f8f09e864ddbf"} Dec 03 06:49:49 crc kubenswrapper[4831]: I1203 06:49:49.888138 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" event={"ID":"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c","Type":"ContainerStarted","Data":"904a0103cd3371fb01c2934ed0b3c2deab670dc5f4c0336f7995f3679f07da19"} Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.363658 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c8vfv"] Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.391215 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lqs6"] Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.392704 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.400724 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lqs6"] Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.486793 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.486844 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5fr5\" (UniqueName: \"kubernetes.io/projected/25903af2-da65-4fb7-81f4-fc2514a738e1-kube-api-access-b5fr5\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.486915 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-config\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.588340 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-config\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.588451 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.588491 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5fr5\" (UniqueName: \"kubernetes.io/projected/25903af2-da65-4fb7-81f4-fc2514a738e1-kube-api-access-b5fr5\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.589712 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.590043 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-config\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.610290 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5fr5\" (UniqueName: \"kubernetes.io/projected/25903af2-da65-4fb7-81f4-fc2514a738e1-kube-api-access-b5fr5\") pod \"dnsmasq-dns-5ccc8479f9-9lqs6\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:50 crc kubenswrapper[4831]: I1203 06:49:50.711053 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.221290 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zllbc"] Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.248333 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v9pk"] Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.249548 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.254964 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v9pk"] Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.325535 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lqs6"] Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.413465 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.413561 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml8dc\" (UniqueName: \"kubernetes.io/projected/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-kube-api-access-ml8dc\") pod \"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.413633 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-config\") pod 
\"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.515098 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml8dc\" (UniqueName: \"kubernetes.io/projected/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-kube-api-access-ml8dc\") pod \"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.515189 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-config\") pod \"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.515232 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.517562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.517763 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-config\") pod \"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.542904 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml8dc\" (UniqueName: \"kubernetes.io/projected/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-kube-api-access-ml8dc\") pod \"dnsmasq-dns-57d769cc4f-2v9pk\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.581440 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.735262 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.736723 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.743831 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.744058 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.744202 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.744231 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-nthqk" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.744407 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.744635 4831 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.744999 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.756420 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825468 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825512 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825545 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d6ac806-4ac5-4de4-b6a0-b265032150f4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825561 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc 
kubenswrapper[4831]: I1203 06:49:51.825585 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825603 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825617 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsdbs\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-kube-api-access-fsdbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825635 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825652 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc 
kubenswrapper[4831]: I1203 06:49:51.825684 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d6ac806-4ac5-4de4-b6a0-b265032150f4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.825718 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.923159 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" event={"ID":"25903af2-da65-4fb7-81f4-fc2514a738e1","Type":"ContainerStarted","Data":"15614f0a7a4fda1be2998ac7f1d26d2b47ef53cb282c701fabed6eece1202b0c"} Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927489 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927554 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927575 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927593 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsdbs\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-kube-api-access-fsdbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927629 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927650 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927682 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d6ac806-4ac5-4de4-b6a0-b265032150f4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927780 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927807 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.927832 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d6ac806-4ac5-4de4-b6a0-b265032150f4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.929101 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.930423 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") device mount path \"/mnt/openstack/pv02\"" 
pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.931013 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.931201 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.933494 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.937964 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.940831 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d6ac806-4ac5-4de4-b6a0-b265032150f4-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.941688 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d6ac806-4ac5-4de4-b6a0-b265032150f4-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.941961 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.944118 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.947113 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsdbs\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-kube-api-access-fsdbs\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:51 crc kubenswrapper[4831]: I1203 06:49:51.987769 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.069976 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.179867 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v9pk"] Dec 03 06:49:52 crc kubenswrapper[4831]: W1203 06:49:52.199902 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb1847a88_1fe3_4ee1_84f7_28e59d30d5e4.slice/crio-cd249c12d081afdb1883da3bcf647e4f2af50ef654a5e9c0798a9ca0a0548b46 WatchSource:0}: Error finding container cd249c12d081afdb1883da3bcf647e4f2af50ef654a5e9c0798a9ca0a0548b46: Status 404 returned error can't find the container with id cd249c12d081afdb1883da3bcf647e4f2af50ef654a5e9c0798a9ca0a0548b46 Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.368328 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.387776 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.387983 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.391791 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.392075 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-vr2dd" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.392115 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.392304 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.392367 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.392325 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.396769 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.544988 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc0cbb94-92ec-4369-b609-f3186f302c66-pod-info\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 
06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545055 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545110 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smwwf\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-kube-api-access-smwwf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545132 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545157 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545193 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545209 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc0cbb94-92ec-4369-b609-f3186f302c66-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545227 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.545248 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.646671 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.646753 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.646781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc0cbb94-92ec-4369-b609-f3186f302c66-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.648235 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-server-conf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.650716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.650793 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.650862 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc0cbb94-92ec-4369-b609-f3186f302c66-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.650904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.650925 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.650975 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.651020 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smwwf\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-kube-api-access-smwwf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.651065 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.651470 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.652342 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.653773 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.656753 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.657160 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.679304 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-server-0\" (UID: 
\"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.702062 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.702144 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smwwf\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-kube-api-access-smwwf\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.703287 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc0cbb94-92ec-4369-b609-f3186f302c66-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.707977 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.709807 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.710429 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc0cbb94-92ec-4369-b609-f3186f302c66-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"dc0cbb94-92ec-4369-b609-f3186f302c66\") " pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.726537 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.954019 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" event={"ID":"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4","Type":"ContainerStarted","Data":"cd249c12d081afdb1883da3bcf647e4f2af50ef654a5e9c0798a9ca0a0548b46"} Dec 03 06:49:52 crc kubenswrapper[4831]: I1203 06:49:52.957012 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d6ac806-4ac5-4de4-b6a0-b265032150f4","Type":"ContainerStarted","Data":"e960d02942ee37d96724ed9e66feb25249f3e46c0430b2f545795a2dd1ce326d"} Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.386814 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.660217 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.661774 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.665062 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.665413 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-r7vst" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.666093 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.668494 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.671552 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.675544 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.776921 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kolla-config\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.776964 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.777025 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-default\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.777042 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmfj8\" (UniqueName: \"kubernetes.io/projected/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kube-api-access-fmfj8\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.777066 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.777081 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.777112 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.777143 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.878795 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.878867 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kolla-config\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.878891 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.878944 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-default\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.878961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmfj8\" (UniqueName: \"kubernetes.io/projected/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kube-api-access-fmfj8\") pod \"openstack-galera-0\" (UID: 
\"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.878982 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.878996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.879030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.879249 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.879519 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-generated\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc 
kubenswrapper[4831]: I1203 06:49:53.880092 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-default\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.880637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kolla-config\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.880930 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-operator-scripts\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.890479 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.891248 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.912635 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.916154 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmfj8\" (UniqueName: \"kubernetes.io/projected/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kube-api-access-fmfj8\") pod \"openstack-galera-0\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " pod="openstack/openstack-galera-0" Dec 03 06:49:53 crc kubenswrapper[4831]: I1203 06:49:53.985638 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 06:49:54 crc kubenswrapper[4831]: I1203 06:49:54.013746 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc0cbb94-92ec-4369-b609-f3186f302c66","Type":"ContainerStarted","Data":"2a0709e4a5635f9cc13d8259ae8df8a212e11b8c97b94c1cb186db44ad3b61cb"} Dec 03 06:49:54 crc kubenswrapper[4831]: I1203 06:49:54.743337 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 06:49:54 crc kubenswrapper[4831]: W1203 06:49:54.756941 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55c21e0b_d5ab_4a59_9b0b_c1d8bbe6a01b.slice/crio-7e40cb987e0469c635f3de5b8417af2e7616ec9f1700bb9c2af7a4b7938dfee1 WatchSource:0}: Error finding container 7e40cb987e0469c635f3de5b8417af2e7616ec9f1700bb9c2af7a4b7938dfee1: Status 404 returned error can't find the container with id 7e40cb987e0469c635f3de5b8417af2e7616ec9f1700bb9c2af7a4b7938dfee1 Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.037087 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b","Type":"ContainerStarted","Data":"7e40cb987e0469c635f3de5b8417af2e7616ec9f1700bb9c2af7a4b7938dfee1"} Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.079281 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.081123 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.085417 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-qlmnw" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.085684 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.093085 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.093880 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.094500 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.228195 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.228239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdgmf\" (UniqueName: 
\"kubernetes.io/projected/e358bf07-df54-4268-9421-f31c57f5594c-kube-api-access-tdgmf\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.228277 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.228306 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.228367 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.228404 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.228422 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/e358bf07-df54-4268-9421-f31c57f5594c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.228442 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.329735 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.329782 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdgmf\" (UniqueName: \"kubernetes.io/projected/e358bf07-df54-4268-9421-f31c57f5594c-kube-api-access-tdgmf\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.329811 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.329844 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.329886 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.329929 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.329954 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e358bf07-df54-4268-9421-f31c57f5594c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.329985 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.331702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-kolla-config\") pod 
\"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.331772 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e358bf07-df54-4268-9421-f31c57f5594c-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.331907 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.334007 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.334734 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.339648 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " 
pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.359027 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.364922 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdgmf\" (UniqueName: \"kubernetes.io/projected/e358bf07-df54-4268-9421-f31c57f5594c-kube-api-access-tdgmf\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.365471 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.424552 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.426790 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.431794 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-9rslh" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.432006 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.432214 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.438369 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.453363 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.533175 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwcmz\" (UniqueName: \"kubernetes.io/projected/e1ffe861-7d12-49e2-9737-fc100833da39-kube-api-access-rwcmz\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.533214 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.533236 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-config-data\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " 
pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.533291 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-kolla-config\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.533309 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.635954 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-config-data\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.636357 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-kolla-config\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.636378 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.636478 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rwcmz\" (UniqueName: \"kubernetes.io/projected/e1ffe861-7d12-49e2-9737-fc100833da39-kube-api-access-rwcmz\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.636501 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.638708 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-kolla-config\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.639181 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-config-data\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.642680 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-combined-ca-bundle\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.646979 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-memcached-tls-certs\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc 
kubenswrapper[4831]: I1203 06:49:55.656768 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwcmz\" (UniqueName: \"kubernetes.io/projected/e1ffe861-7d12-49e2-9737-fc100833da39-kube-api-access-rwcmz\") pod \"memcached-0\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " pod="openstack/memcached-0" Dec 03 06:49:55 crc kubenswrapper[4831]: I1203 06:49:55.777545 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 06:49:56 crc kubenswrapper[4831]: I1203 06:49:56.044010 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:49:56 crc kubenswrapper[4831]: W1203 06:49:56.050603 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode358bf07_df54_4268_9421_f31c57f5594c.slice/crio-2e5e2bffb22c4da37da1e4f70e3ae92cb8da1e7129f3872335a5c10d4181003f WatchSource:0}: Error finding container 2e5e2bffb22c4da37da1e4f70e3ae92cb8da1e7129f3872335a5c10d4181003f: Status 404 returned error can't find the container with id 2e5e2bffb22c4da37da1e4f70e3ae92cb8da1e7129f3872335a5c10d4181003f Dec 03 06:49:56 crc kubenswrapper[4831]: I1203 06:49:56.411551 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.068895 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e358bf07-df54-4268-9421-f31c57f5594c","Type":"ContainerStarted","Data":"2e5e2bffb22c4da37da1e4f70e3ae92cb8da1e7129f3872335a5c10d4181003f"} Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.401795 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.404544 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.417371 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-4p4wd" Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.445365 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.477676 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2bnf\" (UniqueName: \"kubernetes.io/projected/0f65d17d-bff1-412c-94ab-cf83c538a36c-kube-api-access-v2bnf\") pod \"kube-state-metrics-0\" (UID: \"0f65d17d-bff1-412c-94ab-cf83c538a36c\") " pod="openstack/kube-state-metrics-0" Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.579086 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2bnf\" (UniqueName: \"kubernetes.io/projected/0f65d17d-bff1-412c-94ab-cf83c538a36c-kube-api-access-v2bnf\") pod \"kube-state-metrics-0\" (UID: \"0f65d17d-bff1-412c-94ab-cf83c538a36c\") " pod="openstack/kube-state-metrics-0" Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.596531 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.596592 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 
06:49:57.615188 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2bnf\" (UniqueName: \"kubernetes.io/projected/0f65d17d-bff1-412c-94ab-cf83c538a36c-kube-api-access-v2bnf\") pod \"kube-state-metrics-0\" (UID: \"0f65d17d-bff1-412c-94ab-cf83c538a36c\") " pod="openstack/kube-state-metrics-0" Dec 03 06:49:57 crc kubenswrapper[4831]: I1203 06:49:57.801973 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.278349 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-95nwv"] Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.280461 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.285396 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-jp726" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.286706 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.287398 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.334039 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-95nwv"] Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.366246 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-w7h89"] Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.367801 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.381025 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-w7h89"] Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.398607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-ovn-controller-tls-certs\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.398678 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb6d7\" (UniqueName: \"kubernetes.io/projected/5fe1a689-1241-4c11-93ca-875e53319668-kube-api-access-kb6d7\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.398725 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run-ovn\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.398783 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe1a689-1241-4c11-93ca-875e53319668-scripts\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.398809 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: 
\"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-log-ovn\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.399130 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.399235 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-combined-ca-bundle\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.501520 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-lib\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.501616 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.501712 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pqxbl\" (UniqueName: 
\"kubernetes.io/projected/7f657b4b-bed8-4244-8727-2a3c59364041-kube-api-access-pqxbl\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.501779 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-combined-ca-bundle\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.501836 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-ovn-controller-tls-certs\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.501909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb6d7\" (UniqueName: \"kubernetes.io/projected/5fe1a689-1241-4c11-93ca-875e53319668-kube-api-access-kb6d7\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.501953 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run-ovn\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.502024 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-run\") pod 
\"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.502073 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f657b4b-bed8-4244-8727-2a3c59364041-scripts\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.502139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe1a689-1241-4c11-93ca-875e53319668-scripts\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.502191 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-log-ovn\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.502305 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-log\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.502409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-etc-ovs\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " 
pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.502806 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.502838 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run-ovn\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.504955 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe1a689-1241-4c11-93ca-875e53319668-scripts\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.505609 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-log-ovn\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.507889 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-ovn-controller-tls-certs\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.513954 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-combined-ca-bundle\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.546118 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.557147 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.557257 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.559877 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.561631 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-qslkq" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.561953 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.562104 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.567634 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.604135 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-etc-ovs\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.604186 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-lib\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.604224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pqxbl\" (UniqueName: \"kubernetes.io/projected/7f657b4b-bed8-4244-8727-2a3c59364041-kube-api-access-pqxbl\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.604295 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-run\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.604311 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f657b4b-bed8-4244-8727-2a3c59364041-scripts\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.604374 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-log\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.604379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: 
\"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-etc-ovs\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.604491 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-run\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.605058 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-lib\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.605185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-log\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.606275 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f657b4b-bed8-4244-8727-2a3c59364041-scripts\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.622931 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pqxbl\" (UniqueName: \"kubernetes.io/projected/7f657b4b-bed8-4244-8727-2a3c59364041-kube-api-access-pqxbl\") pod \"ovn-controller-ovs-w7h89\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " 
pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.685255 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.701667 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb6d7\" (UniqueName: \"kubernetes.io/projected/5fe1a689-1241-4c11-93ca-875e53319668-kube-api-access-kb6d7\") pod \"ovn-controller-95nwv\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.705861 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.705904 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.706003 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-config\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.706376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.706461 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2htg6\" (UniqueName: \"kubernetes.io/projected/d0fdc967-7fb5-4702-b184-6953e8aefd19-kube-api-access-2htg6\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.706521 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.706547 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.706566 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.808474 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.808543 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2htg6\" (UniqueName: \"kubernetes.io/projected/d0fdc967-7fb5-4702-b184-6953e8aefd19-kube-api-access-2htg6\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.808589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.808639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.808670 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.808762 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.808805 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.808834 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-config\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.809127 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.809941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.810290 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-config\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.812241 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" 
(UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.814528 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.815880 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.817556 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.827275 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2htg6\" (UniqueName: \"kubernetes.io/projected/d0fdc967-7fb5-4702-b184-6953e8aefd19-kube-api-access-2htg6\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.828377 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-nb-0\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.892646 4831 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 06:50:01 crc kubenswrapper[4831]: I1203 06:50:01.925039 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv" Dec 03 06:50:01 crc kubenswrapper[4831]: W1203 06:50:01.981830 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1ffe861_7d12_49e2_9737_fc100833da39.slice/crio-6fc2106d8d8e2ae434b1ee1fc9a96a4258b13ed90efd411bf72d739990bd9e4b WatchSource:0}: Error finding container 6fc2106d8d8e2ae434b1ee1fc9a96a4258b13ed90efd411bf72d739990bd9e4b: Status 404 returned error can't find the container with id 6fc2106d8d8e2ae434b1ee1fc9a96a4258b13ed90efd411bf72d739990bd9e4b Dec 03 06:50:02 crc kubenswrapper[4831]: I1203 06:50:02.101575 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e1ffe861-7d12-49e2-9737-fc100833da39","Type":"ContainerStarted","Data":"6fc2106d8d8e2ae434b1ee1fc9a96a4258b13ed90efd411bf72d739990bd9e4b"} Dec 03 06:50:04 crc kubenswrapper[4831]: I1203 06:50:04.974241 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:50:04 crc kubenswrapper[4831]: I1203 06:50:04.975796 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:04 crc kubenswrapper[4831]: I1203 06:50:04.982277 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-8tw4h" Dec 03 06:50:04 crc kubenswrapper[4831]: I1203 06:50:04.982666 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 06:50:04 crc kubenswrapper[4831]: I1203 06:50:04.987770 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Dec 03 06:50:04 crc kubenswrapper[4831]: I1203 06:50:04.987967 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 06:50:04 crc kubenswrapper[4831]: I1203 06:50:04.994967 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.112808 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.113150 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.113365 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" 
(UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.113542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.113699 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grn7z\" (UniqueName: \"kubernetes.io/projected/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-kube-api-access-grn7z\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.113856 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.113974 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.114076 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc 
kubenswrapper[4831]: I1203 06:50:05.217264 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.217559 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.217623 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.217858 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.217941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.218092 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.218138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.218525 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-config\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.218683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grn7z\" (UniqueName: \"kubernetes.io/projected/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-kube-api-access-grn7z\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.218825 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.218900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 
06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.220652 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.225675 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.225698 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.242156 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.244571 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grn7z\" (UniqueName: \"kubernetes.io/projected/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-kube-api-access-grn7z\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.258045 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage01-crc\") pod \"ovsdbserver-sb-0\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:05 crc kubenswrapper[4831]: I1203 06:50:05.299462 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 06:50:18 crc kubenswrapper[4831]: E1203 06:50:18.876773 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 06:50:18 crc kubenswrapper[4831]: E1203 06:50:18.878135 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash /var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-acc
ess-fmfj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openstack-galera-0_openstack(55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:18 crc kubenswrapper[4831]: E1203 06:50:18.879427 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-galera-0" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" Dec 03 06:50:19 crc kubenswrapper[4831]: E1203 06:50:19.260261 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-galera-0" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" Dec 03 06:50:19 crc kubenswrapper[4831]: E1203 06:50:19.972706 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 06:50:19 crc 
kubenswrapper[4831]: E1203 06:50:19.972890 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-smwwf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io
/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-server-0_openstack(dc0cbb94-92ec-4369-b609-f3186f302c66): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:19 crc kubenswrapper[4831]: E1203 06:50:19.974423 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-server-0" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" Dec 03 06:50:19 crc kubenswrapper[4831]: E1203 06:50:19.974522 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Dec 03 06:50:19 crc kubenswrapper[4831]: E1203 06:50:19.974841 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdgmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(e358bf07-df54-4268-9421-f31c57f5594c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:19 crc kubenswrapper[4831]: E1203 06:50:19.976143 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="e358bf07-df54-4268-9421-f31c57f5594c" Dec 03 06:50:19 crc kubenswrapper[4831]: E1203 06:50:19.998839 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified" Dec 03 06:50:19 crc kubenswrapper[4831]: E1203 06:50:19.998981 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:setup-container,Image:quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified,Command:[sh -c cp /tmp/erlang-cookie-secret/.erlang.cookie /var/lib/rabbitmq/.erlang.cookie && chmod 600 /var/lib/rabbitmq/.erlang.cookie ; cp /tmp/rabbitmq-plugins/enabled_plugins /operator/enabled_plugins ; echo '[default]' > /var/lib/rabbitmq/.rabbitmqadmin.conf && sed -e 's/default_user/username/' -e 's/default_pass/password/' /tmp/default_user.conf >> /var/lib/rabbitmq/.rabbitmqadmin.conf && chmod 600 /var/lib/rabbitmq/.rabbitmqadmin.conf ; sleep 30],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} BinarySI},},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:plugins-conf,ReadOnly:false,MountPath:/tmp/rabbitmq-plugins/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-erlang-cookie,ReadOnly:false,MountPath:/var/lib/rabbitmq/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:erlang-cookie-secret,ReadOnly:false,MountPath:/tmp/erlang-cookie-secret/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-plugins,ReadOnly:false,MountPath:/operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:persistence,ReadOnly:false,MountPath:/var/lib/rabbitmq/mnesia/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:rabbitmq-confd,ReadOnly:false,MountPath:/tmp/default_user.conf,SubPath:default_user.conf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fsdbs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cell1-server-0_openstack(8d6ac806-4ac5-4de4-b6a0-b265032150f4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:20 crc 
kubenswrapper[4831]: E1203 06:50:20.000151 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.267573 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-cell1-server-0" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.267945 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"setup-container\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-rabbitmq:current-podified\\\"\"" pod="openstack/rabbitmq-server-0" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.269918 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="e358bf07-df54-4268-9421-f31c57f5594c" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.723624 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.723784 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-76xss,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
dnsmasq-dns-675f4bcbfc-c8vfv_openstack(d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.725058 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" podUID="d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.726904 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.727020 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nfdh5dfhb6h64h676hc4h78h97h669h54chfbh696hb5h54bh5d4h6bh64h644h677h584h5cbh698h9dh5bbh5f8h5b8hcdh644h5c7h694hbfh589q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-b5fr5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-5ccc8479f9-9lqs6_openstack(25903af2-da65-4fb7-81f4-fc2514a738e1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.728194 4831 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" podUID="25903af2-da65-4fb7-81f4-fc2514a738e1" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.729237 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.729517 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gsgg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-zllbc_openstack(469328a1-75a9-4909-a0cf-1e8f41656089): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.730757 4831 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" podUID="469328a1-75a9-4909-a0cf-1e8f41656089" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.738005 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.738112 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ml8dc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-2v9pk_openstack(b1847a88-1fe3-4ee1-84f7-28e59d30d5e4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:20 crc kubenswrapper[4831]: E1203 06:50:20.739266 4831 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" podUID="b1847a88-1fe3-4ee1-84f7-28e59d30d5e4" Dec 03 06:50:21 crc kubenswrapper[4831]: E1203 06:50:21.279374 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" podUID="25903af2-da65-4fb7-81f4-fc2514a738e1" Dec 03 06:50:21 crc kubenswrapper[4831]: E1203 06:50:21.279704 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" podUID="b1847a88-1fe3-4ee1-84f7-28e59d30d5e4" Dec 03 06:50:21 crc kubenswrapper[4831]: E1203 06:50:21.333690 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Dec 03 06:50:21 crc kubenswrapper[4831]: E1203 06:50:21.338657 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:n5c7h569hb9h8bhf6hcch556h58chch5fdh5b8h69h5b9h9fh7ch59ch74h5cfh696h7bhf8h5fdh96h5fdh686h579h644h55h56fhb6hddh58fq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rwcmz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(e1ffe861-7d12-49e2-9737-fc100833da39): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:50:21 crc kubenswrapper[4831]: E1203 06:50:21.339845 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="e1ffe861-7d12-49e2-9737-fc100833da39" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.709635 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.717817 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.847129 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gsgg2\" (UniqueName: \"kubernetes.io/projected/469328a1-75a9-4909-a0cf-1e8f41656089-kube-api-access-gsgg2\") pod \"469328a1-75a9-4909-a0cf-1e8f41656089\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.847263 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76xss\" (UniqueName: \"kubernetes.io/projected/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-kube-api-access-76xss\") pod \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\" (UID: \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\") " Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.847437 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-config\") pod \"469328a1-75a9-4909-a0cf-1e8f41656089\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.848181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-config" (OuterVolumeSpecName: "config") pod "469328a1-75a9-4909-a0cf-1e8f41656089" (UID: "469328a1-75a9-4909-a0cf-1e8f41656089"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.848302 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-dns-svc\") pod \"469328a1-75a9-4909-a0cf-1e8f41656089\" (UID: \"469328a1-75a9-4909-a0cf-1e8f41656089\") " Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.848985 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "469328a1-75a9-4909-a0cf-1e8f41656089" (UID: "469328a1-75a9-4909-a0cf-1e8f41656089"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.849195 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-config\") pod \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\" (UID: \"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c\") " Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.849765 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.849797 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/469328a1-75a9-4909-a0cf-1e8f41656089-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.850559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-config" (OuterVolumeSpecName: "config") pod "d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c" (UID: "d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.853669 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-kube-api-access-76xss" (OuterVolumeSpecName: "kube-api-access-76xss") pod "d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c" (UID: "d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c"). InnerVolumeSpecName "kube-api-access-76xss". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.860178 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/469328a1-75a9-4909-a0cf-1e8f41656089-kube-api-access-gsgg2" (OuterVolumeSpecName: "kube-api-access-gsgg2") pod "469328a1-75a9-4909-a0cf-1e8f41656089" (UID: "469328a1-75a9-4909-a0cf-1e8f41656089"). InnerVolumeSpecName "kube-api-access-gsgg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.865097 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-95nwv"] Dec 03 06:50:21 crc kubenswrapper[4831]: W1203 06:50:21.873738 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fe1a689_1241_4c11_93ca_875e53319668.slice/crio-23be6442d413a3e0cc1c8839e5cef9929c9b6459e187771ca9c32fd102e5b8a4 WatchSource:0}: Error finding container 23be6442d413a3e0cc1c8839e5cef9929c9b6459e187771ca9c32fd102e5b8a4: Status 404 returned error can't find the container with id 23be6442d413a3e0cc1c8839e5cef9929c9b6459e187771ca9c32fd102e5b8a4 Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.941457 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.951258 4831 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.951286 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gsgg2\" (UniqueName: \"kubernetes.io/projected/469328a1-75a9-4909-a0cf-1e8f41656089-kube-api-access-gsgg2\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.951298 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76xss\" (UniqueName: \"kubernetes.io/projected/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c-kube-api-access-76xss\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:21 crc kubenswrapper[4831]: I1203 06:50:21.981642 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:50:21 crc kubenswrapper[4831]: W1203 06:50:21.983926 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0f65d17d_bff1_412c_94ab_cf83c538a36c.slice/crio-edcd9b50446d35c2e66d6508ac37e4fd5fa84752dc01e31d062149bdc66dc8cb WatchSource:0}: Error finding container edcd9b50446d35c2e66d6508ac37e4fd5fa84752dc01e31d062149bdc66dc8cb: Status 404 returned error can't find the container with id edcd9b50446d35c2e66d6508ac37e4fd5fa84752dc01e31d062149bdc66dc8cb Dec 03 06:50:22 crc kubenswrapper[4831]: W1203 06:50:22.055752 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0fdc967_7fb5_4702_b184_6953e8aefd19.slice/crio-144fc818d205f39c5f8dc72c8b47e039d45e0bb1ac2fe71d12542c87f95b9669 WatchSource:0}: Error finding container 144fc818d205f39c5f8dc72c8b47e039d45e0bb1ac2fe71d12542c87f95b9669: Status 404 returned error can't find the container with id 144fc818d205f39c5f8dc72c8b47e039d45e0bb1ac2fe71d12542c87f95b9669 Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.057203 4831 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.287557 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" event={"ID":"d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c","Type":"ContainerDied","Data":"904a0103cd3371fb01c2934ed0b3c2deab670dc5f4c0336f7995f3679f07da19"} Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.287699 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-c8vfv" Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.290098 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d3273a2-0cc5-4a9d-8f8c-5828592973e8","Type":"ContainerStarted","Data":"676c74f7f166e4b0b998e8092bfdfcbcb20b027f142d46b6971ffb74921f4f1c"} Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.291664 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0f65d17d-bff1-412c-94ab-cf83c538a36c","Type":"ContainerStarted","Data":"edcd9b50446d35c2e66d6508ac37e4fd5fa84752dc01e31d062149bdc66dc8cb"} Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.295268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv" event={"ID":"5fe1a689-1241-4c11-93ca-875e53319668","Type":"ContainerStarted","Data":"23be6442d413a3e0cc1c8839e5cef9929c9b6459e187771ca9c32fd102e5b8a4"} Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.296772 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d0fdc967-7fb5-4702-b184-6953e8aefd19","Type":"ContainerStarted","Data":"144fc818d205f39c5f8dc72c8b47e039d45e0bb1ac2fe71d12542c87f95b9669"} Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.297953 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" 
event={"ID":"469328a1-75a9-4909-a0cf-1e8f41656089","Type":"ContainerDied","Data":"47109f9247c0173f4aadef663f251c901feb74d59bc629c2928f8f09e864ddbf"} Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.297971 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zllbc" Dec 03 06:50:22 crc kubenswrapper[4831]: E1203 06:50:22.299326 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="e1ffe861-7d12-49e2-9737-fc100833da39" Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.419564 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c8vfv"] Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.431925 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-c8vfv"] Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.448025 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zllbc"] Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.456981 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zllbc"] Dec 03 06:50:22 crc kubenswrapper[4831]: I1203 06:50:22.662922 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-w7h89"] Dec 03 06:50:22 crc kubenswrapper[4831]: W1203 06:50:22.798668 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f657b4b_bed8_4244_8727_2a3c59364041.slice/crio-77e4435c2b9852cf3bd63836f6d3dc5a707ee52c2e763cfa611a28d7bbaa5e2a WatchSource:0}: Error finding container 77e4435c2b9852cf3bd63836f6d3dc5a707ee52c2e763cfa611a28d7bbaa5e2a: Status 404 returned error can't find the container with 
id 77e4435c2b9852cf3bd63836f6d3dc5a707ee52c2e763cfa611a28d7bbaa5e2a
Dec 03 06:50:23 crc kubenswrapper[4831]: I1203 06:50:23.026406 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="469328a1-75a9-4909-a0cf-1e8f41656089" path="/var/lib/kubelet/pods/469328a1-75a9-4909-a0cf-1e8f41656089/volumes"
Dec 03 06:50:23 crc kubenswrapper[4831]: I1203 06:50:23.026850 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c" path="/var/lib/kubelet/pods/d6c7d82c-a79b-43d2-b08a-cffd5d7c5e2c/volumes"
Dec 03 06:50:23 crc kubenswrapper[4831]: I1203 06:50:23.305622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w7h89" event={"ID":"7f657b4b-bed8-4244-8727-2a3c59364041","Type":"ContainerStarted","Data":"77e4435c2b9852cf3bd63836f6d3dc5a707ee52c2e763cfa611a28d7bbaa5e2a"}
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.334746 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d0fdc967-7fb5-4702-b184-6953e8aefd19","Type":"ContainerStarted","Data":"bde47117f94d4b3d7621163e09c3a26678467d5110db8728318ccec32cffa515"}
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.336306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d3273a2-0cc5-4a9d-8f8c-5828592973e8","Type":"ContainerStarted","Data":"f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210"}
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.337996 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0f65d17d-bff1-412c-94ab-cf83c538a36c","Type":"ContainerStarted","Data":"ecd51d1a399172f04cada1b6c630a980e38d48f2222540f5057abc4cda8babb0"}
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.338102 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.339678 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv" event={"ID":"5fe1a689-1241-4c11-93ca-875e53319668","Type":"ContainerStarted","Data":"cfbeb31890e53191672906e82bdc1b8672d17ef21f7bf4d793893c62a489706c"}
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.339806 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-95nwv"
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.341532 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w7h89" event={"ID":"7f657b4b-bed8-4244-8727-2a3c59364041","Type":"ContainerStarted","Data":"a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae"}
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.355659 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=25.368582325 podStartE2EDuration="29.355638389s" podCreationTimestamp="2025-12-03 06:49:57 +0000 UTC" firstStartedPulling="2025-12-03 06:50:21.985459206 +0000 UTC m=+1159.329042714" lastFinishedPulling="2025-12-03 06:50:25.97251527 +0000 UTC m=+1163.316098778" observedRunningTime="2025-12-03 06:50:26.351112327 +0000 UTC m=+1163.694695845" watchObservedRunningTime="2025-12-03 06:50:26.355638389 +0000 UTC m=+1163.699221897"
Dec 03 06:50:26 crc kubenswrapper[4831]: I1203 06:50:26.370801 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-95nwv" podStartSLOduration=21.362597179 podStartE2EDuration="25.370783873s" podCreationTimestamp="2025-12-03 06:50:01 +0000 UTC" firstStartedPulling="2025-12-03 06:50:21.880640487 +0000 UTC m=+1159.224223995" lastFinishedPulling="2025-12-03 06:50:25.888827171 +0000 UTC m=+1163.232410689" observedRunningTime="2025-12-03 06:50:26.364715153 +0000 UTC m=+1163.708298671" watchObservedRunningTime="2025-12-03 06:50:26.370783873 +0000 UTC m=+1163.714367381"
Dec 03 06:50:27 crc kubenswrapper[4831]: I1203 06:50:27.350687 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f657b4b-bed8-4244-8727-2a3c59364041" containerID="a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae" exitCode=0
Dec 03 06:50:27 crc kubenswrapper[4831]: I1203 06:50:27.352143 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w7h89" event={"ID":"7f657b4b-bed8-4244-8727-2a3c59364041","Type":"ContainerDied","Data":"a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae"}
Dec 03 06:50:27 crc kubenswrapper[4831]: I1203 06:50:27.596250 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 06:50:27 crc kubenswrapper[4831]: I1203 06:50:27.596717 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 06:50:27 crc kubenswrapper[4831]: I1203 06:50:27.596771 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5"
Dec 03 06:50:27 crc kubenswrapper[4831]: I1203 06:50:27.597620 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fe6e405940a8abb32a63a2b267869e6a6149449d90d59de98ade036092eb761f"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 06:50:27 crc kubenswrapper[4831]: I1203 06:50:27.597709 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://fe6e405940a8abb32a63a2b267869e6a6149449d90d59de98ade036092eb761f" gracePeriod=600
Dec 03 06:50:28 crc kubenswrapper[4831]: I1203 06:50:28.361144 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w7h89" event={"ID":"7f657b4b-bed8-4244-8727-2a3c59364041","Type":"ContainerStarted","Data":"63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4"}
Dec 03 06:50:28 crc kubenswrapper[4831]: I1203 06:50:28.364763 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="fe6e405940a8abb32a63a2b267869e6a6149449d90d59de98ade036092eb761f" exitCode=0
Dec 03 06:50:28 crc kubenswrapper[4831]: I1203 06:50:28.364800 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"fe6e405940a8abb32a63a2b267869e6a6149449d90d59de98ade036092eb761f"}
Dec 03 06:50:28 crc kubenswrapper[4831]: I1203 06:50:28.364828 4831 scope.go:117] "RemoveContainer" containerID="9c066cdb31940f01296e6a59517a0bf4cdb6c0c7137c9abb1a013450afc9368b"
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.379481 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w7h89" event={"ID":"7f657b4b-bed8-4244-8727-2a3c59364041","Type":"ContainerStarted","Data":"e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74"}
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.381115 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-w7h89"
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.381219 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-w7h89"
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.383095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"cdd4c5391a62949652f778a85945bc3bd1190df6d8604a3965df5d75b3dfc56a"}
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.386017 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d0fdc967-7fb5-4702-b184-6953e8aefd19","Type":"ContainerStarted","Data":"0546a20824087822da4e65e207c3344405637cecf083c66a5cb892cd81e57ccb"}
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.388255 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d3273a2-0cc5-4a9d-8f8c-5828592973e8","Type":"ContainerStarted","Data":"f49e9e078c0c7dc85b489ebd0a280551621b057e73799f1a37d159eb18f5dff8"}
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.410002 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-w7h89" podStartSLOduration=25.324149473 podStartE2EDuration="28.409979895s" podCreationTimestamp="2025-12-03 06:50:01 +0000 UTC" firstStartedPulling="2025-12-03 06:50:22.80141992 +0000 UTC m=+1160.145003428" lastFinishedPulling="2025-12-03 06:50:25.887250342 +0000 UTC m=+1163.230833850" observedRunningTime="2025-12-03 06:50:29.400249811 +0000 UTC m=+1166.743833329" watchObservedRunningTime="2025-12-03 06:50:29.409979895 +0000 UTC m=+1166.753563423"
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.433611 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=19.37481378 podStartE2EDuration="26.433577824s" podCreationTimestamp="2025-12-03 06:50:03 +0000 UTC" firstStartedPulling="2025-12-03 06:50:21.949931155 +0000 UTC m=+1159.293514663" lastFinishedPulling="2025-12-03 06:50:29.008695199 +0000 UTC m=+1166.352278707" observedRunningTime="2025-12-03 06:50:29.425193112 +0000 UTC m=+1166.768776630" watchObservedRunningTime="2025-12-03 06:50:29.433577824 +0000 UTC m=+1166.777161342"
Dec 03 06:50:29 crc kubenswrapper[4831]: I1203 06:50:29.461501 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=22.519289571 podStartE2EDuration="29.461479247s" podCreationTimestamp="2025-12-03 06:50:00 +0000 UTC" firstStartedPulling="2025-12-03 06:50:22.05745619 +0000 UTC m=+1159.401039718" lastFinishedPulling="2025-12-03 06:50:28.999645896 +0000 UTC m=+1166.343229394" observedRunningTime="2025-12-03 06:50:29.458847195 +0000 UTC m=+1166.802430703" watchObservedRunningTime="2025-12-03 06:50:29.461479247 +0000 UTC m=+1166.805062765"
Dec 03 06:50:30 crc kubenswrapper[4831]: I1203 06:50:30.300476 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0"
Dec 03 06:50:31 crc kubenswrapper[4831]: I1203 06:50:31.892901 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Dec 03 06:50:31 crc kubenswrapper[4831]: I1203 06:50:31.893419 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0"
Dec 03 06:50:31 crc kubenswrapper[4831]: I1203 06:50:31.938247 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.300386 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.356501 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.488436 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.498605 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.794125 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lqs6"]
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.831083 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-8s4vh"]
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.839645 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.842051 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.867993 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8s4vh"]
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.885390 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-xlchk"]
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.886737 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.890259 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.910384 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-xlchk"]
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.955974 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v9pk"]
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.960931 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.960999 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-config\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.961030 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-combined-ca-bundle\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.961052 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.961075 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr5gv\" (UniqueName: \"kubernetes.io/projected/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-kube-api-access-zr5gv\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.961101 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5c67f9-4d9b-428a-a974-9162d81b1f02-config\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.961120 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfd4c\" (UniqueName: \"kubernetes.io/projected/3b5c67f9-4d9b-428a-a974-9162d81b1f02-kube-api-access-gfd4c\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.961162 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.961176 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovs-rundir\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.961207 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovn-rundir\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.984128 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-4wrns"]
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.985564 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:32 crc kubenswrapper[4831]: I1203 06:50:32.993356 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.069919 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4wrns"]
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.070187 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.072560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-config\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.072649 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-combined-ca-bundle\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.072693 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.072726 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-dns-svc\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.072750 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr5gv\" (UniqueName: \"kubernetes.io/projected/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-kube-api-access-zr5gv\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.072786 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.072814 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5c67f9-4d9b-428a-a974-9162d81b1f02-config\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.072837 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfd4c\" (UniqueName: \"kubernetes.io/projected/3b5c67f9-4d9b-428a-a974-9162d81b1f02-kube-api-access-gfd4c\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.074391 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.074823 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.075113 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovs-rundir\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.075153 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-config\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.075182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/fd7ebefd-319f-46b1-87eb-7d0669a90bef-kube-api-access-k245d\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.075243 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.075276 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovn-rundir\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.075335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.076330 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-config\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.077117 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-tpxrs"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.077386 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.077542 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.077879 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.078173 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.078763 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5c67f9-4d9b-428a-a974-9162d81b1f02-config\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.080235 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovn-rundir\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.080331 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovs-rundir\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.081033 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-combined-ca-bundle\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.098640 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.102842 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.107250 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.176607 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.176823 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.176869 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-config\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.176922 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/fd7ebefd-319f-46b1-87eb-7d0669a90bef-kube-api-access-k245d\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.176995 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b548290-abc5-4c67-862c-16aa03a652da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.177020 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.177041 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-config\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.177085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.177166 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtx5p\" (UniqueName: \"kubernetes.io/projected/5b548290-abc5-4c67-862c-16aa03a652da-kube-api-access-jtx5p\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.177380 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-dns-svc\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.177452 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.177481 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-scripts\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.181101 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.183674 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.183725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-dns-svc\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.184028 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-config\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.219299 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfd4c\" (UniqueName: \"kubernetes.io/projected/3b5c67f9-4d9b-428a-a974-9162d81b1f02-kube-api-access-gfd4c\") pod \"ovn-controller-metrics-8s4vh\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " pod="openstack/ovn-controller-metrics-8s4vh"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.279413 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.279453 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-scripts\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.279502 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.279544 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b548290-abc5-4c67-862c-16aa03a652da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.279559 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.279578 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-config\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.279609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtx5p\" (UniqueName: \"kubernetes.io/projected/5b548290-abc5-4c67-862c-16aa03a652da-kube-api-access-jtx5p\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.280823 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b548290-abc5-4c67-862c-16aa03a652da-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.281531 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-scripts\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.281701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-config\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.325752 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr5gv\" (UniqueName: \"kubernetes.io/projected/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-kube-api-access-zr5gv\") pod \"dnsmasq-dns-5bf47b49b7-xlchk\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.325829 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/fd7ebefd-319f-46b1-87eb-7d0669a90bef-kube-api-access-k245d\") pod \"dnsmasq-dns-8554648995-4wrns\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " pod="openstack/dnsmasq-dns-8554648995-4wrns"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.326516 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.327459 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.327941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.328225 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtx5p\" (UniqueName: \"kubernetes.io/projected/5b548290-abc5-4c67-862c-16aa03a652da-kube-api-access-jtx5p\") pod \"ovn-northd-0\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " pod="openstack/ovn-northd-0"
Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.368600 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.387279 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-config\") pod \"25903af2-da65-4fb7-81f4-fc2514a738e1\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.387441 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-dns-svc\") pod \"25903af2-da65-4fb7-81f4-fc2514a738e1\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.387544 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5fr5\" (UniqueName: \"kubernetes.io/projected/25903af2-da65-4fb7-81f4-fc2514a738e1-kube-api-access-b5fr5\") pod \"25903af2-da65-4fb7-81f4-fc2514a738e1\" (UID: \"25903af2-da65-4fb7-81f4-fc2514a738e1\") " Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.393272 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "25903af2-da65-4fb7-81f4-fc2514a738e1" (UID: "25903af2-da65-4fb7-81f4-fc2514a738e1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.397148 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25903af2-da65-4fb7-81f4-fc2514a738e1-kube-api-access-b5fr5" (OuterVolumeSpecName: "kube-api-access-b5fr5") pod "25903af2-da65-4fb7-81f4-fc2514a738e1" (UID: "25903af2-da65-4fb7-81f4-fc2514a738e1"). InnerVolumeSpecName "kube-api-access-b5fr5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.398679 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-config" (OuterVolumeSpecName: "config") pod "25903af2-da65-4fb7-81f4-fc2514a738e1" (UID: "25903af2-da65-4fb7-81f4-fc2514a738e1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.411899 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.455919 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8s4vh" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.484764 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.485501 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" event={"ID":"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4","Type":"ContainerDied","Data":"cd249c12d081afdb1883da3bcf647e4f2af50ef654a5e9c0798a9ca0a0548b46"} Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.485588 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-2v9pk" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.490389 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.490424 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/25903af2-da65-4fb7-81f4-fc2514a738e1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.490434 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5fr5\" (UniqueName: \"kubernetes.io/projected/25903af2-da65-4fb7-81f4-fc2514a738e1-kube-api-access-b5fr5\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.508548 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.510411 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-9lqs6" event={"ID":"25903af2-da65-4fb7-81f4-fc2514a738e1","Type":"ContainerDied","Data":"15614f0a7a4fda1be2998ac7f1d26d2b47ef53cb282c701fabed6eece1202b0c"} Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.511588 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.591894 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml8dc\" (UniqueName: \"kubernetes.io/projected/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-kube-api-access-ml8dc\") pod \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.592238 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-config\") pod \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.592292 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-dns-svc\") pod \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\" (UID: \"b1847a88-1fe3-4ee1-84f7-28e59d30d5e4\") " Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.592715 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-config" (OuterVolumeSpecName: "config") pod "b1847a88-1fe3-4ee1-84f7-28e59d30d5e4" (UID: "b1847a88-1fe3-4ee1-84f7-28e59d30d5e4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.592802 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b1847a88-1fe3-4ee1-84f7-28e59d30d5e4" (UID: "b1847a88-1fe3-4ee1-84f7-28e59d30d5e4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.620770 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4wrns" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.695721 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.695755 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.721995 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-kube-api-access-ml8dc" (OuterVolumeSpecName: "kube-api-access-ml8dc") pod "b1847a88-1fe3-4ee1-84f7-28e59d30d5e4" (UID: "b1847a88-1fe3-4ee1-84f7-28e59d30d5e4"). InnerVolumeSpecName "kube-api-access-ml8dc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:33 crc kubenswrapper[4831]: I1203 06:50:33.797994 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml8dc\" (UniqueName: \"kubernetes.io/projected/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4-kube-api-access-ml8dc\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.000767 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v9pk"] Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.022657 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-2v9pk"] Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.047631 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lqs6"] Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.056004 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-9lqs6"] Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.349788 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.434028 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4wrns"] Dec 03 06:50:34 crc kubenswrapper[4831]: W1203 06:50:34.516547 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3b5c67f9_4d9b_428a_a974_9162d81b1f02.slice/crio-97b7e3a9d84ca408ddea79357809a1ee6dd0af40ff26afef0603c2ed7e5c186d WatchSource:0}: Error finding container 97b7e3a9d84ca408ddea79357809a1ee6dd0af40ff26afef0603c2ed7e5c186d: Status 404 returned error can't find the container with id 97b7e3a9d84ca408ddea79357809a1ee6dd0af40ff26afef0603c2ed7e5c186d Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.517384 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d6ac806-4ac5-4de4-b6a0-b265032150f4","Type":"ContainerStarted","Data":"055b028fb60e01afe549cd5b7477b7dee313aab541390287578dc34dd742e9d3"} Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.520630 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b548290-abc5-4c67-862c-16aa03a652da","Type":"ContainerStarted","Data":"9163527ee0c82ba9f14f26c03aa04b245c67e69f182fc2dafc6a7e0626cc9046"} Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.521552 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-8s4vh"] Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.523642 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e358bf07-df54-4268-9421-f31c57f5594c","Type":"ContainerStarted","Data":"08a79d800c1a289cb8bcbf290c6e99647a9abaf260f75ecabb6f2350b81fe0f4"} Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.526495 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b","Type":"ContainerStarted","Data":"48e689b6dcef20403bef843759f14be2c5ce12c72fc1101d7f6683cde1f5871c"} Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.530116 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4wrns" event={"ID":"fd7ebefd-319f-46b1-87eb-7d0669a90bef","Type":"ContainerStarted","Data":"f3bdf7da52912656aad87b39efc9c980070a7844f679b1eb7bb711ee9135d48b"} Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.532489 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc0cbb94-92ec-4369-b609-f3186f302c66","Type":"ContainerStarted","Data":"454f4b86b6c9c0a48f9173d3808d9aeaba4f55df7d2f0c60530104f9fda643a9"} Dec 03 06:50:34 crc kubenswrapper[4831]: I1203 06:50:34.597410 4831 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-xlchk"] Dec 03 06:50:34 crc kubenswrapper[4831]: W1203 06:50:34.642888 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode596da18_b5fc_4fa5_b960_d3360ae3b1f5.slice/crio-dfc7e10461a78358e26a060657a47967509de2de6d6248daf7764fd1f70e889c WatchSource:0}: Error finding container dfc7e10461a78358e26a060657a47967509de2de6d6248daf7764fd1f70e889c: Status 404 returned error can't find the container with id dfc7e10461a78358e26a060657a47967509de2de6d6248daf7764fd1f70e889c Dec 03 06:50:35 crc kubenswrapper[4831]: I1203 06:50:35.023417 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25903af2-da65-4fb7-81f4-fc2514a738e1" path="/var/lib/kubelet/pods/25903af2-da65-4fb7-81f4-fc2514a738e1/volumes" Dec 03 06:50:35 crc kubenswrapper[4831]: I1203 06:50:35.024002 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1847a88-1fe3-4ee1-84f7-28e59d30d5e4" path="/var/lib/kubelet/pods/b1847a88-1fe3-4ee1-84f7-28e59d30d5e4/volumes" Dec 03 06:50:35 crc kubenswrapper[4831]: I1203 06:50:35.539707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" event={"ID":"e596da18-b5fc-4fa5-b960-d3360ae3b1f5","Type":"ContainerStarted","Data":"dfc7e10461a78358e26a060657a47967509de2de6d6248daf7764fd1f70e889c"} Dec 03 06:50:35 crc kubenswrapper[4831]: I1203 06:50:35.542543 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8s4vh" event={"ID":"3b5c67f9-4d9b-428a-a974-9162d81b1f02","Type":"ContainerStarted","Data":"85779452492d794c13e8c469cf65390b164504234a47b5ad1564537fb4273f5a"} Dec 03 06:50:35 crc kubenswrapper[4831]: I1203 06:50:35.542572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8s4vh" 
event={"ID":"3b5c67f9-4d9b-428a-a974-9162d81b1f02","Type":"ContainerStarted","Data":"97b7e3a9d84ca408ddea79357809a1ee6dd0af40ff26afef0603c2ed7e5c186d"} Dec 03 06:50:35 crc kubenswrapper[4831]: I1203 06:50:35.547856 4831 generic.go:334] "Generic (PLEG): container finished" podID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" containerID="c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a" exitCode=0 Dec 03 06:50:35 crc kubenswrapper[4831]: I1203 06:50:35.547907 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4wrns" event={"ID":"fd7ebefd-319f-46b1-87eb-7d0669a90bef","Type":"ContainerDied","Data":"c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a"} Dec 03 06:50:35 crc kubenswrapper[4831]: I1203 06:50:35.569956 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-8s4vh" podStartSLOduration=3.569939394 podStartE2EDuration="3.569939394s" podCreationTimestamp="2025-12-03 06:50:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:35.564267596 +0000 UTC m=+1172.907851124" watchObservedRunningTime="2025-12-03 06:50:35.569939394 +0000 UTC m=+1172.913522902" Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.568871 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4wrns" event={"ID":"fd7ebefd-319f-46b1-87eb-7d0669a90bef","Type":"ContainerStarted","Data":"10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59"} Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.569375 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8554648995-4wrns" Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.570908 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" 
event={"ID":"5b548290-abc5-4c67-862c-16aa03a652da","Type":"ContainerStarted","Data":"86e3f1fa5d839f3ee714b40f2722df67d64bd2305d8193e7a010af7f96797f76"} Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.570980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b548290-abc5-4c67-862c-16aa03a652da","Type":"ContainerStarted","Data":"f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa"} Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.572062 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.574562 4831 generic.go:334] "Generic (PLEG): container finished" podID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" containerID="f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e" exitCode=0 Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.574641 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" event={"ID":"e596da18-b5fc-4fa5-b960-d3360ae3b1f5","Type":"ContainerDied","Data":"f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e"} Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.578125 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e1ffe861-7d12-49e2-9737-fc100833da39","Type":"ContainerStarted","Data":"2a6aed804c2a9f3ed8236fffc4704163803135cc2b4f57e197408d2f4c85bb43"} Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.578399 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.757887 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8554648995-4wrns" podStartSLOduration=5.235215984 podStartE2EDuration="5.757871389s" podCreationTimestamp="2025-12-03 06:50:32 +0000 UTC" firstStartedPulling="2025-12-03 06:50:34.456519003 +0000 UTC 
m=+1171.800102511" lastFinishedPulling="2025-12-03 06:50:34.979174408 +0000 UTC m=+1172.322757916" observedRunningTime="2025-12-03 06:50:37.752661816 +0000 UTC m=+1175.096245324" watchObservedRunningTime="2025-12-03 06:50:37.757871389 +0000 UTC m=+1175.101454897" Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.779146 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=7.801304986 podStartE2EDuration="42.779127014s" podCreationTimestamp="2025-12-03 06:49:55 +0000 UTC" firstStartedPulling="2025-12-03 06:50:01.98521837 +0000 UTC m=+1139.328801878" lastFinishedPulling="2025-12-03 06:50:36.963040398 +0000 UTC m=+1174.306623906" observedRunningTime="2025-12-03 06:50:37.766245701 +0000 UTC m=+1175.109829209" watchObservedRunningTime="2025-12-03 06:50:37.779127014 +0000 UTC m=+1175.122710522" Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.821675 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.328411496 podStartE2EDuration="4.821652315s" podCreationTimestamp="2025-12-03 06:50:33 +0000 UTC" firstStartedPulling="2025-12-03 06:50:34.379257075 +0000 UTC m=+1171.722840583" lastFinishedPulling="2025-12-03 06:50:36.872497894 +0000 UTC m=+1174.216081402" observedRunningTime="2025-12-03 06:50:37.817153764 +0000 UTC m=+1175.160737272" watchObservedRunningTime="2025-12-03 06:50:37.821652315 +0000 UTC m=+1175.165235823" Dec 03 06:50:37 crc kubenswrapper[4831]: I1203 06:50:37.834812 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 06:50:38 crc kubenswrapper[4831]: I1203 06:50:38.592515 4831 generic.go:334] "Generic (PLEG): container finished" podID="e358bf07-df54-4268-9421-f31c57f5594c" containerID="08a79d800c1a289cb8bcbf290c6e99647a9abaf260f75ecabb6f2350b81fe0f4" exitCode=0 Dec 03 06:50:38 crc kubenswrapper[4831]: I1203 06:50:38.592613 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e358bf07-df54-4268-9421-f31c57f5594c","Type":"ContainerDied","Data":"08a79d800c1a289cb8bcbf290c6e99647a9abaf260f75ecabb6f2350b81fe0f4"} Dec 03 06:50:38 crc kubenswrapper[4831]: I1203 06:50:38.595012 4831 generic.go:334] "Generic (PLEG): container finished" podID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" containerID="48e689b6dcef20403bef843759f14be2c5ce12c72fc1101d7f6683cde1f5871c" exitCode=0 Dec 03 06:50:38 crc kubenswrapper[4831]: I1203 06:50:38.595134 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b","Type":"ContainerDied","Data":"48e689b6dcef20403bef843759f14be2c5ce12c72fc1101d7f6683cde1f5871c"} Dec 03 06:50:38 crc kubenswrapper[4831]: I1203 06:50:38.599377 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" event={"ID":"e596da18-b5fc-4fa5-b960-d3360ae3b1f5","Type":"ContainerStarted","Data":"952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f"} Dec 03 06:50:38 crc kubenswrapper[4831]: I1203 06:50:38.687011 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" podStartSLOduration=6.15350995 podStartE2EDuration="6.686966623s" podCreationTimestamp="2025-12-03 06:50:32 +0000 UTC" firstStartedPulling="2025-12-03 06:50:34.647925392 +0000 UTC m=+1171.991508900" lastFinishedPulling="2025-12-03 06:50:35.181382065 +0000 UTC m=+1172.524965573" observedRunningTime="2025-12-03 06:50:38.681263864 +0000 UTC m=+1176.024847392" watchObservedRunningTime="2025-12-03 06:50:38.686966623 +0000 UTC m=+1176.030550131" Dec 03 06:50:39 crc kubenswrapper[4831]: I1203 06:50:39.612602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" 
event={"ID":"e358bf07-df54-4268-9421-f31c57f5594c","Type":"ContainerStarted","Data":"a1ad2c7b7ffe4f7516b2ef2e3d38d3444b1dcf04767f85dd6169d820a50b087a"} Dec 03 06:50:39 crc kubenswrapper[4831]: I1203 06:50:39.615884 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b","Type":"ContainerStarted","Data":"2083069d237ad9e634573977e455af0f77cad2d64fa39a6dce0010ff03639849"} Dec 03 06:50:39 crc kubenswrapper[4831]: I1203 06:50:39.616242 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" Dec 03 06:50:39 crc kubenswrapper[4831]: I1203 06:50:39.645783 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=7.909660005 podStartE2EDuration="45.645759545s" podCreationTimestamp="2025-12-03 06:49:54 +0000 UTC" firstStartedPulling="2025-12-03 06:49:56.053203324 +0000 UTC m=+1133.396786832" lastFinishedPulling="2025-12-03 06:50:33.789302864 +0000 UTC m=+1171.132886372" observedRunningTime="2025-12-03 06:50:39.642928216 +0000 UTC m=+1176.986511744" watchObservedRunningTime="2025-12-03 06:50:39.645759545 +0000 UTC m=+1176.989343073" Dec 03 06:50:39 crc kubenswrapper[4831]: I1203 06:50:39.675548 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.821888208 podStartE2EDuration="47.675522417s" podCreationTimestamp="2025-12-03 06:49:52 +0000 UTC" firstStartedPulling="2025-12-03 06:49:54.759542664 +0000 UTC m=+1132.103126172" lastFinishedPulling="2025-12-03 06:50:33.613176873 +0000 UTC m=+1170.956760381" observedRunningTime="2025-12-03 06:50:39.672208403 +0000 UTC m=+1177.015791961" watchObservedRunningTime="2025-12-03 06:50:39.675522417 +0000 UTC m=+1177.019105955" Dec 03 06:50:43 crc kubenswrapper[4831]: I1203 06:50:43.514109 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" Dec 03 06:50:43 crc kubenswrapper[4831]: I1203 06:50:43.622567 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8554648995-4wrns" Dec 03 06:50:43 crc kubenswrapper[4831]: I1203 06:50:43.700389 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-xlchk"] Dec 03 06:50:43 crc kubenswrapper[4831]: I1203 06:50:43.700702 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" podUID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" containerName="dnsmasq-dns" containerID="cri-o://952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f" gracePeriod=10 Dec 03 06:50:43 crc kubenswrapper[4831]: I1203 06:50:43.987226 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 06:50:43 crc kubenswrapper[4831]: I1203 06:50:43.987545 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.077591 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.152699 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.258527 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr5gv\" (UniqueName: \"kubernetes.io/projected/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-kube-api-access-zr5gv\") pod \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.258686 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-config\") pod \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.259512 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-dns-svc\") pod \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.259609 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-ovsdbserver-nb\") pod \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\" (UID: \"e596da18-b5fc-4fa5-b960-d3360ae3b1f5\") " Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.270849 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-kube-api-access-zr5gv" (OuterVolumeSpecName: "kube-api-access-zr5gv") pod "e596da18-b5fc-4fa5-b960-d3360ae3b1f5" (UID: "e596da18-b5fc-4fa5-b960-d3360ae3b1f5"). InnerVolumeSpecName "kube-api-access-zr5gv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.296950 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-config" (OuterVolumeSpecName: "config") pod "e596da18-b5fc-4fa5-b960-d3360ae3b1f5" (UID: "e596da18-b5fc-4fa5-b960-d3360ae3b1f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.300557 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e596da18-b5fc-4fa5-b960-d3360ae3b1f5" (UID: "e596da18-b5fc-4fa5-b960-d3360ae3b1f5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.330065 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e596da18-b5fc-4fa5-b960-d3360ae3b1f5" (UID: "e596da18-b5fc-4fa5-b960-d3360ae3b1f5"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.363839 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr5gv\" (UniqueName: \"kubernetes.io/projected/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-kube-api-access-zr5gv\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.363897 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.363910 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.363922 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e596da18-b5fc-4fa5-b960-d3360ae3b1f5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.660501 4831 generic.go:334] "Generic (PLEG): container finished" podID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" containerID="952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f" exitCode=0 Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.660544 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" event={"ID":"e596da18-b5fc-4fa5-b960-d3360ae3b1f5","Type":"ContainerDied","Data":"952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f"} Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.660581 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" event={"ID":"e596da18-b5fc-4fa5-b960-d3360ae3b1f5","Type":"ContainerDied","Data":"dfc7e10461a78358e26a060657a47967509de2de6d6248daf7764fd1f70e889c"} Dec 03 
06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.660599 4831 scope.go:117] "RemoveContainer" containerID="952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.660527 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-xlchk" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.689958 4831 scope.go:117] "RemoveContainer" containerID="f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.695856 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-xlchk"] Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.705237 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-xlchk"] Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.723169 4831 scope.go:117] "RemoveContainer" containerID="952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f" Dec 03 06:50:44 crc kubenswrapper[4831]: E1203 06:50:44.723693 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f\": container with ID starting with 952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f not found: ID does not exist" containerID="952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.723733 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f"} err="failed to get container status \"952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f\": rpc error: code = NotFound desc = could not find container \"952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f\": container with ID 
starting with 952632fe5965c19b53c2b795c71f34afc1c251edc19237ea430207afcef8aa7f not found: ID does not exist" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.723767 4831 scope.go:117] "RemoveContainer" containerID="f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e" Dec 03 06:50:44 crc kubenswrapper[4831]: E1203 06:50:44.723990 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e\": container with ID starting with f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e not found: ID does not exist" containerID="f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.724021 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e"} err="failed to get container status \"f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e\": rpc error: code = NotFound desc = could not find container \"f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e\": container with ID starting with f59e45b2315faa96a3dfc4d301a9fc0cf9fba294eeb36114cdb59a77ccd8a43e not found: ID does not exist" Dec 03 06:50:44 crc kubenswrapper[4831]: I1203 06:50:44.757730 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.021434 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" path="/var/lib/kubelet/pods/e596da18-b5fc-4fa5-b960-d3360ae3b1f5/volumes" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.418367 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-7cf1-account-create-update-qc9fq"] Dec 03 06:50:45 crc kubenswrapper[4831]: E1203 
06:50:45.419056 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" containerName="init" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.419076 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" containerName="init" Dec 03 06:50:45 crc kubenswrapper[4831]: E1203 06:50:45.419092 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" containerName="dnsmasq-dns" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.419099 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" containerName="dnsmasq-dns" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.419299 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e596da18-b5fc-4fa5-b960-d3360ae3b1f5" containerName="dnsmasq-dns" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.420284 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.425297 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.439380 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.439487 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.451392 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cf1-account-create-update-qc9fq"] Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.465242 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-9jjj4"] Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.466433 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.472330 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9jjj4"] Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.484362 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8m9h\" (UniqueName: \"kubernetes.io/projected/072f167c-3d27-49f2-aae9-84ddb84e6a8e-kube-api-access-h8m9h\") pod \"keystone-7cf1-account-create-update-qc9fq\" (UID: \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\") " pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.484420 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-operator-scripts\") pod \"keystone-db-create-9jjj4\" (UID: \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\") " pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.484459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072f167c-3d27-49f2-aae9-84ddb84e6a8e-operator-scripts\") pod \"keystone-7cf1-account-create-update-qc9fq\" (UID: \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\") " pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.484599 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmtvb\" (UniqueName: \"kubernetes.io/projected/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-kube-api-access-dmtvb\") pod \"keystone-db-create-9jjj4\" (UID: \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\") " pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.555112 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.585746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8m9h\" (UniqueName: \"kubernetes.io/projected/072f167c-3d27-49f2-aae9-84ddb84e6a8e-kube-api-access-h8m9h\") pod \"keystone-7cf1-account-create-update-qc9fq\" (UID: \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\") " pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.585797 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-operator-scripts\") pod \"keystone-db-create-9jjj4\" (UID: \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\") " pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.585820 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072f167c-3d27-49f2-aae9-84ddb84e6a8e-operator-scripts\") pod \"keystone-7cf1-account-create-update-qc9fq\" (UID: \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\") " pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.585896 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmtvb\" (UniqueName: \"kubernetes.io/projected/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-kube-api-access-dmtvb\") pod \"keystone-db-create-9jjj4\" (UID: \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\") " pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.586638 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-operator-scripts\") pod 
\"keystone-db-create-9jjj4\" (UID: \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\") " pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.586651 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072f167c-3d27-49f2-aae9-84ddb84e6a8e-operator-scripts\") pod \"keystone-7cf1-account-create-update-qc9fq\" (UID: \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\") " pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.600834 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8m9h\" (UniqueName: \"kubernetes.io/projected/072f167c-3d27-49f2-aae9-84ddb84e6a8e-kube-api-access-h8m9h\") pod \"keystone-7cf1-account-create-update-qc9fq\" (UID: \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\") " pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.601226 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmtvb\" (UniqueName: \"kubernetes.io/projected/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-kube-api-access-dmtvb\") pod \"keystone-db-create-9jjj4\" (UID: \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\") " pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.715839 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-l964v"] Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.716786 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-l964v" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.723424 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-l964v"] Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.732413 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-2619-account-create-update-tlzdz"] Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.734368 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.735627 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.738390 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.739438 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2619-account-create-update-tlzdz"] Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.778684 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.783436 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.797243 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.894294 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78440d07-b290-4710-a9b0-ce6c0b5efa0c-operator-scripts\") pod \"placement-db-create-l964v\" (UID: \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\") " pod="openstack/placement-db-create-l964v" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.895589 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvx4c\" (UniqueName: \"kubernetes.io/projected/78440d07-b290-4710-a9b0-ce6c0b5efa0c-kube-api-access-dvx4c\") pod \"placement-db-create-l964v\" (UID: \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\") " pod="openstack/placement-db-create-l964v" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.896086 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82zvj\" (UniqueName: \"kubernetes.io/projected/e0778b59-e2e9-4afe-ab6d-3d85eebba895-kube-api-access-82zvj\") pod \"placement-2619-account-create-update-tlzdz\" (UID: \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\") " pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:45 crc kubenswrapper[4831]: I1203 06:50:45.896157 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0778b59-e2e9-4afe-ab6d-3d85eebba895-operator-scripts\") pod \"placement-2619-account-create-update-tlzdz\" (UID: \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\") " pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.003444 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82zvj\" (UniqueName: 
\"kubernetes.io/projected/e0778b59-e2e9-4afe-ab6d-3d85eebba895-kube-api-access-82zvj\") pod \"placement-2619-account-create-update-tlzdz\" (UID: \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\") " pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.003506 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0778b59-e2e9-4afe-ab6d-3d85eebba895-operator-scripts\") pod \"placement-2619-account-create-update-tlzdz\" (UID: \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\") " pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.003606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78440d07-b290-4710-a9b0-ce6c0b5efa0c-operator-scripts\") pod \"placement-db-create-l964v\" (UID: \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\") " pod="openstack/placement-db-create-l964v" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.003666 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvx4c\" (UniqueName: \"kubernetes.io/projected/78440d07-b290-4710-a9b0-ce6c0b5efa0c-kube-api-access-dvx4c\") pod \"placement-db-create-l964v\" (UID: \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\") " pod="openstack/placement-db-create-l964v" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.004593 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78440d07-b290-4710-a9b0-ce6c0b5efa0c-operator-scripts\") pod \"placement-db-create-l964v\" (UID: \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\") " pod="openstack/placement-db-create-l964v" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.004639 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/e0778b59-e2e9-4afe-ab6d-3d85eebba895-operator-scripts\") pod \"placement-2619-account-create-update-tlzdz\" (UID: \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\") " pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.005422 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-vzchf"] Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.009508 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vzchf" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.016301 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vzchf"] Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.035818 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvx4c\" (UniqueName: \"kubernetes.io/projected/78440d07-b290-4710-a9b0-ce6c0b5efa0c-kube-api-access-dvx4c\") pod \"placement-db-create-l964v\" (UID: \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\") " pod="openstack/placement-db-create-l964v" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.041951 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82zvj\" (UniqueName: \"kubernetes.io/projected/e0778b59-e2e9-4afe-ab6d-3d85eebba895-kube-api-access-82zvj\") pod \"placement-2619-account-create-update-tlzdz\" (UID: \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\") " pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.050668 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-l964v" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.106610 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3e24-account-create-update-zbd4p"] Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.107734 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.109716 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.132035 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3e24-account-create-update-zbd4p"] Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.205661 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-operator-scripts\") pod \"glance-db-create-vzchf\" (UID: \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\") " pod="openstack/glance-db-create-vzchf" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.205757 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2tmt\" (UniqueName: \"kubernetes.io/projected/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-kube-api-access-x2tmt\") pod \"glance-db-create-vzchf\" (UID: \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\") " pod="openstack/glance-db-create-vzchf" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.226548 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.307293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ef9a7-5412-4f03-8656-86f20966986f-operator-scripts\") pod \"glance-3e24-account-create-update-zbd4p\" (UID: \"ba9ef9a7-5412-4f03-8656-86f20966986f\") " pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.307402 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m99pw\" (UniqueName: \"kubernetes.io/projected/ba9ef9a7-5412-4f03-8656-86f20966986f-kube-api-access-m99pw\") pod \"glance-3e24-account-create-update-zbd4p\" (UID: \"ba9ef9a7-5412-4f03-8656-86f20966986f\") " pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.307444 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2tmt\" (UniqueName: \"kubernetes.io/projected/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-kube-api-access-x2tmt\") pod \"glance-db-create-vzchf\" (UID: \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\") " pod="openstack/glance-db-create-vzchf" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.307579 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-operator-scripts\") pod \"glance-db-create-vzchf\" (UID: \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\") " pod="openstack/glance-db-create-vzchf" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.308524 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-operator-scripts\") pod 
\"glance-db-create-vzchf\" (UID: \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\") " pod="openstack/glance-db-create-vzchf" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.325739 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2tmt\" (UniqueName: \"kubernetes.io/projected/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-kube-api-access-x2tmt\") pod \"glance-db-create-vzchf\" (UID: \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\") " pod="openstack/glance-db-create-vzchf" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.334847 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-9jjj4"] Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.393864 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vzchf" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.408129 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-7cf1-account-create-update-qc9fq"] Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.411647 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ef9a7-5412-4f03-8656-86f20966986f-operator-scripts\") pod \"glance-3e24-account-create-update-zbd4p\" (UID: \"ba9ef9a7-5412-4f03-8656-86f20966986f\") " pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.412001 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m99pw\" (UniqueName: \"kubernetes.io/projected/ba9ef9a7-5412-4f03-8656-86f20966986f-kube-api-access-m99pw\") pod \"glance-3e24-account-create-update-zbd4p\" (UID: \"ba9ef9a7-5412-4f03-8656-86f20966986f\") " pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.412694 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ef9a7-5412-4f03-8656-86f20966986f-operator-scripts\") pod \"glance-3e24-account-create-update-zbd4p\" (UID: \"ba9ef9a7-5412-4f03-8656-86f20966986f\") " pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.436590 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m99pw\" (UniqueName: \"kubernetes.io/projected/ba9ef9a7-5412-4f03-8656-86f20966986f-kube-api-access-m99pw\") pod \"glance-3e24-account-create-update-zbd4p\" (UID: \"ba9ef9a7-5412-4f03-8656-86f20966986f\") " pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.451126 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.596767 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-l964v"] Dec 03 06:50:46 crc kubenswrapper[4831]: W1203 06:50:46.617183 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod78440d07_b290_4710_a9b0_ce6c0b5efa0c.slice/crio-72acf6db7ff52bbdcbd298bcf920e2ae9b9304bc1a8d3d66fba12d4fa3b193c8 WatchSource:0}: Error finding container 72acf6db7ff52bbdcbd298bcf920e2ae9b9304bc1a8d3d66fba12d4fa3b193c8: Status 404 returned error can't find the container with id 72acf6db7ff52bbdcbd298bcf920e2ae9b9304bc1a8d3d66fba12d4fa3b193c8 Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.678534 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cf1-account-create-update-qc9fq" event={"ID":"072f167c-3d27-49f2-aae9-84ddb84e6a8e","Type":"ContainerStarted","Data":"6ca9630d3043a302dd9c859703f97e0a009cd5cf19e03dd4ecc50cf8420860a9"} Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.679987 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9jjj4" event={"ID":"67dbccb6-784e-4fba-8b01-227a5b3d1a3e","Type":"ContainerStarted","Data":"8c2dcbad7750849b6be9771a4f94a9823beb233f81ffd0e05f59ae49bdc63cf8"} Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.680010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9jjj4" event={"ID":"67dbccb6-784e-4fba-8b01-227a5b3d1a3e","Type":"ContainerStarted","Data":"2182c7bd81a06a834a65e23a40d7674f1b1db78427f767bd37c2ab73b47dc737"} Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.683205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l964v" event={"ID":"78440d07-b290-4710-a9b0-ce6c0b5efa0c","Type":"ContainerStarted","Data":"72acf6db7ff52bbdcbd298bcf920e2ae9b9304bc1a8d3d66fba12d4fa3b193c8"} Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.710441 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-2619-account-create-update-tlzdz"] Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.713094 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-create-9jjj4" podStartSLOduration=1.713079527 podStartE2EDuration="1.713079527s" podCreationTimestamp="2025-12-03 06:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:46.700370398 +0000 UTC m=+1184.043953906" watchObservedRunningTime="2025-12-03 06:50:46.713079527 +0000 UTC m=+1184.056663035" Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.834152 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-vzchf"] Dec 03 06:50:46 crc kubenswrapper[4831]: W1203 06:50:46.835445 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbce2f30f_5796_4bcd_8c15_f9c71969ebb1.slice/crio-ad4ca0ed7801326fd071ceacae8f1500f9f7fe3368bb228d1e3ec0a36d38905e WatchSource:0}: Error finding container ad4ca0ed7801326fd071ceacae8f1500f9f7fe3368bb228d1e3ec0a36d38905e: Status 404 returned error can't find the container with id ad4ca0ed7801326fd071ceacae8f1500f9f7fe3368bb228d1e3ec0a36d38905e Dec 03 06:50:46 crc kubenswrapper[4831]: I1203 06:50:46.940927 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3e24-account-create-update-zbd4p"] Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.678170 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bkmqn"] Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.683545 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.716278 4831 generic.go:334] "Generic (PLEG): container finished" podID="67dbccb6-784e-4fba-8b01-227a5b3d1a3e" containerID="8c2dcbad7750849b6be9771a4f94a9823beb233f81ffd0e05f59ae49bdc63cf8" exitCode=0 Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.716389 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9jjj4" event={"ID":"67dbccb6-784e-4fba-8b01-227a5b3d1a3e","Type":"ContainerDied","Data":"8c2dcbad7750849b6be9771a4f94a9823beb233f81ffd0e05f59ae49bdc63cf8"} Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.716990 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bkmqn"] Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.728696 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l964v" event={"ID":"78440d07-b290-4710-a9b0-ce6c0b5efa0c","Type":"ContainerStarted","Data":"8f3f08db7a40b296fad0ffb99615b25c24f20d9ef4906c8b1c92750fff434683"} Dec 03 06:50:47 crc 
kubenswrapper[4831]: I1203 06:50:47.743980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2619-account-create-update-tlzdz" event={"ID":"e0778b59-e2e9-4afe-ab6d-3d85eebba895","Type":"ContainerStarted","Data":"c817be12198932ba478b957a6adc60b0016409ee731b40995f2037effe75923a"} Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.744023 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2619-account-create-update-tlzdz" event={"ID":"e0778b59-e2e9-4afe-ab6d-3d85eebba895","Type":"ContainerStarted","Data":"04eb3b72fbe9fea8295a39db2bdd986b0517cbd08d92ad3fe39abdc830d0c401"} Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.748750 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3e24-account-create-update-zbd4p" event={"ID":"ba9ef9a7-5412-4f03-8656-86f20966986f","Type":"ContainerStarted","Data":"737e737d087255df158d38703f28751940cefd79f763139d2111b0f555e473ed"} Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.768039 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cf1-account-create-update-qc9fq" event={"ID":"072f167c-3d27-49f2-aae9-84ddb84e6a8e","Type":"ContainerStarted","Data":"25a311e276ecde83fbfb561d83a456c375aa79b31fba2fee94be2d8120bcf6e0"} Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.777656 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-2619-account-create-update-tlzdz" podStartSLOduration=2.777636049 podStartE2EDuration="2.777636049s" podCreationTimestamp="2025-12-03 06:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:47.776560725 +0000 UTC m=+1185.120144233" watchObservedRunningTime="2025-12-03 06:50:47.777636049 +0000 UTC m=+1185.121219557" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.789747 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-db-create-vzchf" event={"ID":"bce2f30f-5796-4bcd-8c15-f9c71969ebb1","Type":"ContainerStarted","Data":"61288a4ffbb5c63055a0602926b289f0293c643cceda88d7579316fb994d6eed"} Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.789798 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vzchf" event={"ID":"bce2f30f-5796-4bcd-8c15-f9c71969ebb1","Type":"ContainerStarted","Data":"ad4ca0ed7801326fd071ceacae8f1500f9f7fe3368bb228d1e3ec0a36d38905e"} Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.806449 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-create-l964v" podStartSLOduration=2.806427589 podStartE2EDuration="2.806427589s" podCreationTimestamp="2025-12-03 06:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:47.798300205 +0000 UTC m=+1185.141883733" watchObservedRunningTime="2025-12-03 06:50:47.806427589 +0000 UTC m=+1185.150011097" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.819032 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-7cf1-account-create-update-qc9fq" podStartSLOduration=2.819013414 podStartE2EDuration="2.819013414s" podCreationTimestamp="2025-12-03 06:50:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:47.813976536 +0000 UTC m=+1185.157560044" watchObservedRunningTime="2025-12-03 06:50:47.819013414 +0000 UTC m=+1185.162596922" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.841564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: 
\"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.842967 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjb62\" (UniqueName: \"kubernetes.io/projected/ff831122-53fa-4905-a6c7-71f216c98da5-kube-api-access-sjb62\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.843516 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-config\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.843802 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.843943 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.949159 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: 
\"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.949220 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.949260 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.949336 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjb62\" (UniqueName: \"kubernetes.io/projected/ff831122-53fa-4905-a6c7-71f216c98da5-kube-api-access-sjb62\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.949397 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-config\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.950456 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" 
Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.951038 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.952313 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.954426 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-config\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:47 crc kubenswrapper[4831]: I1203 06:50:47.994040 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjb62\" (UniqueName: \"kubernetes.io/projected/ff831122-53fa-4905-a6c7-71f216c98da5-kube-api-access-sjb62\") pod \"dnsmasq-dns-b8fbc5445-bkmqn\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.067634 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.521190 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bkmqn"] Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.553510 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.798444 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3e24-account-create-update-zbd4p" event={"ID":"ba9ef9a7-5412-4f03-8656-86f20966986f","Type":"ContainerStarted","Data":"9e4f7d54ce5529f9e5dcc758e6b2dea870d4d72e906329750dd8f3148bb9a9ae"} Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.802136 4831 generic.go:334] "Generic (PLEG): container finished" podID="bce2f30f-5796-4bcd-8c15-f9c71969ebb1" containerID="61288a4ffbb5c63055a0602926b289f0293c643cceda88d7579316fb994d6eed" exitCode=0 Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.802231 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vzchf" event={"ID":"bce2f30f-5796-4bcd-8c15-f9c71969ebb1","Type":"ContainerDied","Data":"61288a4ffbb5c63055a0602926b289f0293c643cceda88d7579316fb994d6eed"} Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.804165 4831 generic.go:334] "Generic (PLEG): container finished" podID="78440d07-b290-4710-a9b0-ce6c0b5efa0c" containerID="8f3f08db7a40b296fad0ffb99615b25c24f20d9ef4906c8b1c92750fff434683" exitCode=0 Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.804253 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l964v" event={"ID":"78440d07-b290-4710-a9b0-ce6c0b5efa0c","Type":"ContainerDied","Data":"8f3f08db7a40b296fad0ffb99615b25c24f20d9ef4906c8b1c92750fff434683"} Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.805546 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="ff831122-53fa-4905-a6c7-71f216c98da5" containerID="44812a208760edb2a83837f399a6e36dc6ace94363dcb8fc4fa5e091ea75c01c" exitCode=0 Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.805867 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" event={"ID":"ff831122-53fa-4905-a6c7-71f216c98da5","Type":"ContainerDied","Data":"44812a208760edb2a83837f399a6e36dc6ace94363dcb8fc4fa5e091ea75c01c"} Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.805916 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" event={"ID":"ff831122-53fa-4905-a6c7-71f216c98da5","Type":"ContainerStarted","Data":"61e80bbe7a8066cf24d9b0fb26685887c26c087ebe567e68da5103160fec7f26"} Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.812641 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.820108 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3e24-account-create-update-zbd4p" podStartSLOduration=2.820077569 podStartE2EDuration="2.820077569s" podCreationTimestamp="2025-12-03 06:50:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:48.819789539 +0000 UTC m=+1186.163373047" watchObservedRunningTime="2025-12-03 06:50:48.820077569 +0000 UTC m=+1186.163661077" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.821577 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.827165 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.827494 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.827739 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.827985 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-zs4bz" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.853194 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.969706 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.969892 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-cache\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.969917 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-lock\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:48 crc kubenswrapper[4831]: 
I1203 06:50:48.969958 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp86h\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-kube-api-access-jp86h\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:48 crc kubenswrapper[4831]: I1203 06:50:48.970005 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.073098 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-cache\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.073146 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-lock\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.073185 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jp86h\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-kube-api-access-jp86h\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: E1203 06:50:49.073537 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:50:49 crc 
kubenswrapper[4831]: E1203 06:50:49.073557 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:50:49 crc kubenswrapper[4831]: E1203 06:50:49.073604 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift podName:7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5 nodeName:}" failed. No retries permitted until 2025-12-03 06:50:49.573588432 +0000 UTC m=+1186.917171940 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift") pod "swift-storage-0" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5") : configmap "swift-ring-files" not found Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.073223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.073834 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.074125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-cache\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.074183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" 
(UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-lock\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.074194 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.103141 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp86h\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-kube-api-access-jp86h\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.112423 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.180479 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.269894 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-f2nlc"] Dec 03 06:50:49 crc kubenswrapper[4831]: E1203 06:50:49.270219 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="67dbccb6-784e-4fba-8b01-227a5b3d1a3e" containerName="mariadb-database-create" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.270234 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="67dbccb6-784e-4fba-8b01-227a5b3d1a3e" containerName="mariadb-database-create" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.270387 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="67dbccb6-784e-4fba-8b01-227a5b3d1a3e" containerName="mariadb-database-create" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.270993 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.277623 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.277623 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.277954 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dmtvb\" (UniqueName: \"kubernetes.io/projected/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-kube-api-access-dmtvb\") pod \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\" (UID: \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\") " Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.278006 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-operator-scripts\") 
pod \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\" (UID: \"67dbccb6-784e-4fba-8b01-227a5b3d1a3e\") " Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.277711 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.278782 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "67dbccb6-784e-4fba-8b01-227a5b3d1a3e" (UID: "67dbccb6-784e-4fba-8b01-227a5b3d1a3e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.281884 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-kube-api-access-dmtvb" (OuterVolumeSpecName: "kube-api-access-dmtvb") pod "67dbccb6-784e-4fba-8b01-227a5b3d1a3e" (UID: "67dbccb6-784e-4fba-8b01-227a5b3d1a3e"). InnerVolumeSpecName "kube-api-access-dmtvb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.285930 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-f2nlc"] Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.379748 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-dispersionconf\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.379840 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-scripts\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.379875 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-swiftconf\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.379891 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce49982-d05f-49dd-9b08-ae54e662b628-etc-swift\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.379920 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-combined-ca-bundle\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.379944 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws69z\" (UniqueName: \"kubernetes.io/projected/9ce49982-d05f-49dd-9b08-ae54e662b628-kube-api-access-ws69z\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.379978 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-ring-data-devices\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.380031 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dmtvb\" (UniqueName: \"kubernetes.io/projected/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-kube-api-access-dmtvb\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.380041 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/67dbccb6-784e-4fba-8b01-227a5b3d1a3e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.481693 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-scripts\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc 
kubenswrapper[4831]: I1203 06:50:49.481762 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-swiftconf\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.481781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce49982-d05f-49dd-9b08-ae54e662b628-etc-swift\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.481817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-combined-ca-bundle\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.481840 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws69z\" (UniqueName: \"kubernetes.io/projected/9ce49982-d05f-49dd-9b08-ae54e662b628-kube-api-access-ws69z\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.481874 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-ring-data-devices\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.481950 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-dispersionconf\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.482413 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-scripts\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.482803 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce49982-d05f-49dd-9b08-ae54e662b628-etc-swift\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.482891 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-ring-data-devices\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.485448 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-swiftconf\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.485806 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-combined-ca-bundle\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.486720 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-dispersionconf\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.500669 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws69z\" (UniqueName: \"kubernetes.io/projected/9ce49982-d05f-49dd-9b08-ae54e662b628-kube-api-access-ws69z\") pod \"swift-ring-rebalance-f2nlc\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.583730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:49 crc kubenswrapper[4831]: E1203 06:50:49.584063 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:50:49 crc kubenswrapper[4831]: E1203 06:50:49.584127 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:50:49 crc kubenswrapper[4831]: E1203 06:50:49.584238 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift podName:7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5 nodeName:}" failed. 
No retries permitted until 2025-12-03 06:50:50.58420365 +0000 UTC m=+1187.927787208 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift") pod "swift-storage-0" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5") : configmap "swift-ring-files" not found Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.626936 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.816185 4831 generic.go:334] "Generic (PLEG): container finished" podID="e0778b59-e2e9-4afe-ab6d-3d85eebba895" containerID="c817be12198932ba478b957a6adc60b0016409ee731b40995f2037effe75923a" exitCode=0 Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.816424 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2619-account-create-update-tlzdz" event={"ID":"e0778b59-e2e9-4afe-ab6d-3d85eebba895","Type":"ContainerDied","Data":"c817be12198932ba478b957a6adc60b0016409ee731b40995f2037effe75923a"} Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.820013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" event={"ID":"ff831122-53fa-4905-a6c7-71f216c98da5","Type":"ContainerStarted","Data":"003eea5872bfb8451fffe36f9af7621579806c93e61afbb27894cb3cfeb28df9"} Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.822466 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.823161 4831 generic.go:334] "Generic (PLEG): container finished" podID="ba9ef9a7-5412-4f03-8656-86f20966986f" containerID="9e4f7d54ce5529f9e5dcc758e6b2dea870d4d72e906329750dd8f3148bb9a9ae" exitCode=0 Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.823250 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-3e24-account-create-update-zbd4p" event={"ID":"ba9ef9a7-5412-4f03-8656-86f20966986f","Type":"ContainerDied","Data":"9e4f7d54ce5529f9e5dcc758e6b2dea870d4d72e906329750dd8f3148bb9a9ae"} Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.828590 4831 generic.go:334] "Generic (PLEG): container finished" podID="072f167c-3d27-49f2-aae9-84ddb84e6a8e" containerID="25a311e276ecde83fbfb561d83a456c375aa79b31fba2fee94be2d8120bcf6e0" exitCode=0 Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.828675 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cf1-account-create-update-qc9fq" event={"ID":"072f167c-3d27-49f2-aae9-84ddb84e6a8e","Type":"ContainerDied","Data":"25a311e276ecde83fbfb561d83a456c375aa79b31fba2fee94be2d8120bcf6e0"} Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.831390 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-9jjj4" event={"ID":"67dbccb6-784e-4fba-8b01-227a5b3d1a3e","Type":"ContainerDied","Data":"2182c7bd81a06a834a65e23a40d7674f1b1db78427f767bd37c2ab73b47dc737"} Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.831421 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2182c7bd81a06a834a65e23a40d7674f1b1db78427f767bd37c2ab73b47dc737" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.831436 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-9jjj4" Dec 03 06:50:49 crc kubenswrapper[4831]: I1203 06:50:49.894168 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" podStartSLOduration=2.894148069 podStartE2EDuration="2.894148069s" podCreationTimestamp="2025-12-03 06:50:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:50:49.885452597 +0000 UTC m=+1187.229036105" watchObservedRunningTime="2025-12-03 06:50:49.894148069 +0000 UTC m=+1187.237731577" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.149424 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-f2nlc"] Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.283127 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vzchf" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.289423 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-l964v" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.399056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2tmt\" (UniqueName: \"kubernetes.io/projected/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-kube-api-access-x2tmt\") pod \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\" (UID: \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\") " Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.399098 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-operator-scripts\") pod \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\" (UID: \"bce2f30f-5796-4bcd-8c15-f9c71969ebb1\") " Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.399178 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78440d07-b290-4710-a9b0-ce6c0b5efa0c-operator-scripts\") pod \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\" (UID: \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\") " Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.399264 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvx4c\" (UniqueName: \"kubernetes.io/projected/78440d07-b290-4710-a9b0-ce6c0b5efa0c-kube-api-access-dvx4c\") pod \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\" (UID: \"78440d07-b290-4710-a9b0-ce6c0b5efa0c\") " Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.399714 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/78440d07-b290-4710-a9b0-ce6c0b5efa0c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "78440d07-b290-4710-a9b0-ce6c0b5efa0c" (UID: "78440d07-b290-4710-a9b0-ce6c0b5efa0c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.399714 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bce2f30f-5796-4bcd-8c15-f9c71969ebb1" (UID: "bce2f30f-5796-4bcd-8c15-f9c71969ebb1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.404250 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-kube-api-access-x2tmt" (OuterVolumeSpecName: "kube-api-access-x2tmt") pod "bce2f30f-5796-4bcd-8c15-f9c71969ebb1" (UID: "bce2f30f-5796-4bcd-8c15-f9c71969ebb1"). InnerVolumeSpecName "kube-api-access-x2tmt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.404797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/78440d07-b290-4710-a9b0-ce6c0b5efa0c-kube-api-access-dvx4c" (OuterVolumeSpecName: "kube-api-access-dvx4c") pod "78440d07-b290-4710-a9b0-ce6c0b5efa0c" (UID: "78440d07-b290-4710-a9b0-ce6c0b5efa0c"). InnerVolumeSpecName "kube-api-access-dvx4c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.501008 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/78440d07-b290-4710-a9b0-ce6c0b5efa0c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.501286 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvx4c\" (UniqueName: \"kubernetes.io/projected/78440d07-b290-4710-a9b0-ce6c0b5efa0c-kube-api-access-dvx4c\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.501297 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2tmt\" (UniqueName: \"kubernetes.io/projected/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-kube-api-access-x2tmt\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.501306 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bce2f30f-5796-4bcd-8c15-f9c71969ebb1-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.603083 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:50 crc kubenswrapper[4831]: E1203 06:50:50.603297 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:50:50 crc kubenswrapper[4831]: E1203 06:50:50.603650 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:50:50 crc kubenswrapper[4831]: E1203 06:50:50.603773 4831 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift podName:7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5 nodeName:}" failed. No retries permitted until 2025-12-03 06:50:52.603756954 +0000 UTC m=+1189.947340462 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift") pod "swift-storage-0" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5") : configmap "swift-ring-files" not found Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.851852 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-vzchf" event={"ID":"bce2f30f-5796-4bcd-8c15-f9c71969ebb1","Type":"ContainerDied","Data":"ad4ca0ed7801326fd071ceacae8f1500f9f7fe3368bb228d1e3ec0a36d38905e"} Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.851884 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-vzchf" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.851904 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad4ca0ed7801326fd071ceacae8f1500f9f7fe3368bb228d1e3ec0a36d38905e" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.853560 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-l964v" event={"ID":"78440d07-b290-4710-a9b0-ce6c0b5efa0c","Type":"ContainerDied","Data":"72acf6db7ff52bbdcbd298bcf920e2ae9b9304bc1a8d3d66fba12d4fa3b193c8"} Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.853579 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="72acf6db7ff52bbdcbd298bcf920e2ae9b9304bc1a8d3d66fba12d4fa3b193c8" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.853625 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-l964v" Dec 03 06:50:50 crc kubenswrapper[4831]: I1203 06:50:50.855893 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f2nlc" event={"ID":"9ce49982-d05f-49dd-9b08-ae54e662b628","Type":"ContainerStarted","Data":"cdeebd5454301d4f246eb24912e96950742baf7319a722d63d14ae84c0abccf3"} Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.478478 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.487990 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.499159 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.627503 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82zvj\" (UniqueName: \"kubernetes.io/projected/e0778b59-e2e9-4afe-ab6d-3d85eebba895-kube-api-access-82zvj\") pod \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\" (UID: \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\") " Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.627593 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0778b59-e2e9-4afe-ab6d-3d85eebba895-operator-scripts\") pod \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\" (UID: \"e0778b59-e2e9-4afe-ab6d-3d85eebba895\") " Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.627723 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ef9a7-5412-4f03-8656-86f20966986f-operator-scripts\") pod 
\"ba9ef9a7-5412-4f03-8656-86f20966986f\" (UID: \"ba9ef9a7-5412-4f03-8656-86f20966986f\") " Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.628039 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8m9h\" (UniqueName: \"kubernetes.io/projected/072f167c-3d27-49f2-aae9-84ddb84e6a8e-kube-api-access-h8m9h\") pod \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\" (UID: \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\") " Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.628064 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072f167c-3d27-49f2-aae9-84ddb84e6a8e-operator-scripts\") pod \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\" (UID: \"072f167c-3d27-49f2-aae9-84ddb84e6a8e\") " Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.628135 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m99pw\" (UniqueName: \"kubernetes.io/projected/ba9ef9a7-5412-4f03-8656-86f20966986f-kube-api-access-m99pw\") pod \"ba9ef9a7-5412-4f03-8656-86f20966986f\" (UID: \"ba9ef9a7-5412-4f03-8656-86f20966986f\") " Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.628630 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e0778b59-e2e9-4afe-ab6d-3d85eebba895-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e0778b59-e2e9-4afe-ab6d-3d85eebba895" (UID: "e0778b59-e2e9-4afe-ab6d-3d85eebba895"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.628636 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba9ef9a7-5412-4f03-8656-86f20966986f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba9ef9a7-5412-4f03-8656-86f20966986f" (UID: "ba9ef9a7-5412-4f03-8656-86f20966986f"). 
InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.628704 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/072f167c-3d27-49f2-aae9-84ddb84e6a8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "072f167c-3d27-49f2-aae9-84ddb84e6a8e" (UID: "072f167c-3d27-49f2-aae9-84ddb84e6a8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.633926 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0778b59-e2e9-4afe-ab6d-3d85eebba895-kube-api-access-82zvj" (OuterVolumeSpecName: "kube-api-access-82zvj") pod "e0778b59-e2e9-4afe-ab6d-3d85eebba895" (UID: "e0778b59-e2e9-4afe-ab6d-3d85eebba895"). InnerVolumeSpecName "kube-api-access-82zvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.635522 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/072f167c-3d27-49f2-aae9-84ddb84e6a8e-kube-api-access-h8m9h" (OuterVolumeSpecName: "kube-api-access-h8m9h") pod "072f167c-3d27-49f2-aae9-84ddb84e6a8e" (UID: "072f167c-3d27-49f2-aae9-84ddb84e6a8e"). InnerVolumeSpecName "kube-api-access-h8m9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.637257 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba9ef9a7-5412-4f03-8656-86f20966986f-kube-api-access-m99pw" (OuterVolumeSpecName: "kube-api-access-m99pw") pod "ba9ef9a7-5412-4f03-8656-86f20966986f" (UID: "ba9ef9a7-5412-4f03-8656-86f20966986f"). InnerVolumeSpecName "kube-api-access-m99pw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.729674 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e0778b59-e2e9-4afe-ab6d-3d85eebba895-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.729709 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba9ef9a7-5412-4f03-8656-86f20966986f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.729720 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8m9h\" (UniqueName: \"kubernetes.io/projected/072f167c-3d27-49f2-aae9-84ddb84e6a8e-kube-api-access-h8m9h\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.729730 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/072f167c-3d27-49f2-aae9-84ddb84e6a8e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.729739 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m99pw\" (UniqueName: \"kubernetes.io/projected/ba9ef9a7-5412-4f03-8656-86f20966986f-kube-api-access-m99pw\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.729750 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82zvj\" (UniqueName: \"kubernetes.io/projected/e0778b59-e2e9-4afe-ab6d-3d85eebba895-kube-api-access-82zvj\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.875475 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-2619-account-create-update-tlzdz" 
event={"ID":"e0778b59-e2e9-4afe-ab6d-3d85eebba895","Type":"ContainerDied","Data":"04eb3b72fbe9fea8295a39db2bdd986b0517cbd08d92ad3fe39abdc830d0c401"} Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.875515 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="04eb3b72fbe9fea8295a39db2bdd986b0517cbd08d92ad3fe39abdc830d0c401" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.875523 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-2619-account-create-update-tlzdz" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.877210 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3e24-account-create-update-zbd4p" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.877234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3e24-account-create-update-zbd4p" event={"ID":"ba9ef9a7-5412-4f03-8656-86f20966986f","Type":"ContainerDied","Data":"737e737d087255df158d38703f28751940cefd79f763139d2111b0f555e473ed"} Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.877270 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="737e737d087255df158d38703f28751940cefd79f763139d2111b0f555e473ed" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.879003 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-7cf1-account-create-update-qc9fq" event={"ID":"072f167c-3d27-49f2-aae9-84ddb84e6a8e","Type":"ContainerDied","Data":"6ca9630d3043a302dd9c859703f97e0a009cd5cf19e03dd4ecc50cf8420860a9"} Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.879038 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-7cf1-account-create-update-qc9fq" Dec 03 06:50:51 crc kubenswrapper[4831]: I1203 06:50:51.879040 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ca9630d3043a302dd9c859703f97e0a009cd5cf19e03dd4ecc50cf8420860a9" Dec 03 06:50:52 crc kubenswrapper[4831]: I1203 06:50:52.649385 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:52 crc kubenswrapper[4831]: E1203 06:50:52.649868 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:50:52 crc kubenswrapper[4831]: E1203 06:50:52.649881 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:50:52 crc kubenswrapper[4831]: E1203 06:50:52.649926 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift podName:7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5 nodeName:}" failed. No retries permitted until 2025-12-03 06:50:56.649909661 +0000 UTC m=+1193.993493169 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift") pod "swift-storage-0" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5") : configmap "swift-ring-files" not found Dec 03 06:50:55 crc kubenswrapper[4831]: I1203 06:50:55.930434 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f2nlc" event={"ID":"9ce49982-d05f-49dd-9b08-ae54e662b628","Type":"ContainerStarted","Data":"6d6b1c280fd68fd1f6d2b15694ba84f3b57d21b989d2fef9e7c2285d54cbffc8"} Dec 03 06:50:55 crc kubenswrapper[4831]: I1203 06:50:55.953306 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-f2nlc" podStartSLOduration=2.206918027 podStartE2EDuration="6.95327784s" podCreationTimestamp="2025-12-03 06:50:49 +0000 UTC" firstStartedPulling="2025-12-03 06:50:50.161771963 +0000 UTC m=+1187.505355471" lastFinishedPulling="2025-12-03 06:50:54.908131776 +0000 UTC m=+1192.251715284" observedRunningTime="2025-12-03 06:50:55.951210785 +0000 UTC m=+1193.294794303" watchObservedRunningTime="2025-12-03 06:50:55.95327784 +0000 UTC m=+1193.296861388" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.278722 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-nxpq5"] Dec 03 06:50:56 crc kubenswrapper[4831]: E1203 06:50:56.279446 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="78440d07-b290-4710-a9b0-ce6c0b5efa0c" containerName="mariadb-database-create" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279471 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="78440d07-b290-4710-a9b0-ce6c0b5efa0c" containerName="mariadb-database-create" Dec 03 06:50:56 crc kubenswrapper[4831]: E1203 06:50:56.279495 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="072f167c-3d27-49f2-aae9-84ddb84e6a8e" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc 
kubenswrapper[4831]: I1203 06:50:56.279503 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="072f167c-3d27-49f2-aae9-84ddb84e6a8e" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc kubenswrapper[4831]: E1203 06:50:56.279518 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0778b59-e2e9-4afe-ab6d-3d85eebba895" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279526 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0778b59-e2e9-4afe-ab6d-3d85eebba895" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc kubenswrapper[4831]: E1203 06:50:56.279546 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bce2f30f-5796-4bcd-8c15-f9c71969ebb1" containerName="mariadb-database-create" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279553 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bce2f30f-5796-4bcd-8c15-f9c71969ebb1" containerName="mariadb-database-create" Dec 03 06:50:56 crc kubenswrapper[4831]: E1203 06:50:56.279566 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba9ef9a7-5412-4f03-8656-86f20966986f" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279574 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba9ef9a7-5412-4f03-8656-86f20966986f" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279785 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="072f167c-3d27-49f2-aae9-84ddb84e6a8e" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279801 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba9ef9a7-5412-4f03-8656-86f20966986f" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279812 4831 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e0778b59-e2e9-4afe-ab6d-3d85eebba895" containerName="mariadb-account-create-update" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279826 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="78440d07-b290-4710-a9b0-ce6c0b5efa0c" containerName="mariadb-database-create" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.279844 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bce2f30f-5796-4bcd-8c15-f9c71969ebb1" containerName="mariadb-database-create" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.280538 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.282573 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b8246" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.283327 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.289033 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nxpq5"] Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.429063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-config-data\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.429157 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-combined-ca-bundle\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc 
kubenswrapper[4831]: I1203 06:50:56.429410 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-db-sync-config-data\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.429516 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ctfv\" (UniqueName: \"kubernetes.io/projected/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-kube-api-access-4ctfv\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.531763 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ctfv\" (UniqueName: \"kubernetes.io/projected/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-kube-api-access-4ctfv\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.531916 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-config-data\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.531992 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-combined-ca-bundle\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.532044 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-db-sync-config-data\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.535787 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-config-data\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.536032 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-combined-ca-bundle\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.540472 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-db-sync-config-data\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.563783 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ctfv\" (UniqueName: \"kubernetes.io/projected/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-kube-api-access-4ctfv\") pod \"glance-db-sync-nxpq5\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.596798 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-nxpq5" Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.739366 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:50:56 crc kubenswrapper[4831]: E1203 06:50:56.739609 4831 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Dec 03 06:50:56 crc kubenswrapper[4831]: E1203 06:50:56.739787 4831 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Dec 03 06:50:56 crc kubenswrapper[4831]: E1203 06:50:56.739873 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift podName:7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5 nodeName:}" failed. No retries permitted until 2025-12-03 06:51:04.739851393 +0000 UTC m=+1202.083434921 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift") pod "swift-storage-0" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5") : configmap "swift-ring-files" not found Dec 03 06:50:56 crc kubenswrapper[4831]: I1203 06:50:56.971083 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-95nwv" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" probeResult="failure" output=< Dec 03 06:50:56 crc kubenswrapper[4831]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 06:50:56 crc kubenswrapper[4831]: > Dec 03 06:50:57 crc kubenswrapper[4831]: I1203 06:50:57.151566 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-nxpq5"] Dec 03 06:50:57 crc kubenswrapper[4831]: W1203 06:50:57.163135 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode07ae49e_a6fb_478a_8b6f_3f8f687f4afd.slice/crio-c14983c3ee527d1c356f19fee494d88ec4a098cd3a3702b16a4a20e798576de3 WatchSource:0}: Error finding container c14983c3ee527d1c356f19fee494d88ec4a098cd3a3702b16a4a20e798576de3: Status 404 returned error can't find the container with id c14983c3ee527d1c356f19fee494d88ec4a098cd3a3702b16a4a20e798576de3 Dec 03 06:50:57 crc kubenswrapper[4831]: I1203 06:50:57.947244 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nxpq5" event={"ID":"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd","Type":"ContainerStarted","Data":"c14983c3ee527d1c356f19fee494d88ec4a098cd3a3702b16a4a20e798576de3"} Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.069446 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.154434 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-8554648995-4wrns"] Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.154886 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-8554648995-4wrns" podUID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" containerName="dnsmasq-dns" containerID="cri-o://10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59" gracePeriod=10 Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.602187 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4wrns" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.679184 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-dns-svc\") pod \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.679235 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/fd7ebefd-319f-46b1-87eb-7d0669a90bef-kube-api-access-k245d\") pod \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.679254 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-sb\") pod \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.679288 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-nb\") pod \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\" (UID: 
\"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.679376 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-config\") pod \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\" (UID: \"fd7ebefd-319f-46b1-87eb-7d0669a90bef\") " Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.692509 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd7ebefd-319f-46b1-87eb-7d0669a90bef-kube-api-access-k245d" (OuterVolumeSpecName: "kube-api-access-k245d") pod "fd7ebefd-319f-46b1-87eb-7d0669a90bef" (UID: "fd7ebefd-319f-46b1-87eb-7d0669a90bef"). InnerVolumeSpecName "kube-api-access-k245d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.781114 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k245d\" (UniqueName: \"kubernetes.io/projected/fd7ebefd-319f-46b1-87eb-7d0669a90bef-kube-api-access-k245d\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.781947 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-config" (OuterVolumeSpecName: "config") pod "fd7ebefd-319f-46b1-87eb-7d0669a90bef" (UID: "fd7ebefd-319f-46b1-87eb-7d0669a90bef"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.784496 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd7ebefd-319f-46b1-87eb-7d0669a90bef" (UID: "fd7ebefd-319f-46b1-87eb-7d0669a90bef"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.804574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd7ebefd-319f-46b1-87eb-7d0669a90bef" (UID: "fd7ebefd-319f-46b1-87eb-7d0669a90bef"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.819580 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd7ebefd-319f-46b1-87eb-7d0669a90bef" (UID: "fd7ebefd-319f-46b1-87eb-7d0669a90bef"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.881667 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.881701 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.881712 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.881721 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd7ebefd-319f-46b1-87eb-7d0669a90bef-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:50:58 crc 
kubenswrapper[4831]: I1203 06:50:58.954982 4831 generic.go:334] "Generic (PLEG): container finished" podID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" containerID="10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59" exitCode=0 Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.955020 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4wrns" event={"ID":"fd7ebefd-319f-46b1-87eb-7d0669a90bef","Type":"ContainerDied","Data":"10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59"} Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.955044 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-4wrns" event={"ID":"fd7ebefd-319f-46b1-87eb-7d0669a90bef","Type":"ContainerDied","Data":"f3bdf7da52912656aad87b39efc9c980070a7844f679b1eb7bb711ee9135d48b"} Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.955060 4831 scope.go:117] "RemoveContainer" containerID="10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.955163 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-4wrns" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.989973 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4wrns"] Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.993437 4831 scope.go:117] "RemoveContainer" containerID="c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a" Dec 03 06:50:58 crc kubenswrapper[4831]: I1203 06:50:58.997516 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-4wrns"] Dec 03 06:50:59 crc kubenswrapper[4831]: I1203 06:50:59.022689 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" path="/var/lib/kubelet/pods/fd7ebefd-319f-46b1-87eb-7d0669a90bef/volumes" Dec 03 06:50:59 crc kubenswrapper[4831]: I1203 06:50:59.023518 4831 scope.go:117] "RemoveContainer" containerID="10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59" Dec 03 06:50:59 crc kubenswrapper[4831]: E1203 06:50:59.023883 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59\": container with ID starting with 10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59 not found: ID does not exist" containerID="10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59" Dec 03 06:50:59 crc kubenswrapper[4831]: I1203 06:50:59.023919 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59"} err="failed to get container status \"10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59\": rpc error: code = NotFound desc = could not find container \"10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59\": container with ID starting with 
10cd4cabc32c3ed5d1936ca4d5399cb512ded90ef20802e74e4b10b5ea176d59 not found: ID does not exist" Dec 03 06:50:59 crc kubenswrapper[4831]: I1203 06:50:59.023943 4831 scope.go:117] "RemoveContainer" containerID="c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a" Dec 03 06:50:59 crc kubenswrapper[4831]: E1203 06:50:59.024368 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a\": container with ID starting with c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a not found: ID does not exist" containerID="c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a" Dec 03 06:50:59 crc kubenswrapper[4831]: I1203 06:50:59.024397 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a"} err="failed to get container status \"c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a\": rpc error: code = NotFound desc = could not find container \"c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a\": container with ID starting with c11bce4eca58128c596f014f0596c3b62dac8a15cdc679ec75f249c2e6dbc25a not found: ID does not exist" Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.744862 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.747375 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.977673 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-95nwv-config-gtssp"] Dec 03 06:51:01 crc kubenswrapper[4831]: E1203 06:51:01.978015 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" containerName="dnsmasq-dns" Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.978031 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" containerName="dnsmasq-dns" Dec 03 06:51:01 crc kubenswrapper[4831]: E1203 06:51:01.978042 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" containerName="init" Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.978048 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" containerName="init" Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.978251 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd7ebefd-319f-46b1-87eb-7d0669a90bef" containerName="dnsmasq-dns" Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.978800 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.980829 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-95nwv" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" probeResult="failure" output=< Dec 03 06:51:01 crc kubenswrapper[4831]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 06:51:01 crc kubenswrapper[4831]: > Dec 03 06:51:01 crc kubenswrapper[4831]: I1203 06:51:01.981451 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.003527 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-95nwv-config-gtssp"] Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.155407 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" 
(UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-additional-scripts\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.155761 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.155800 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-scripts\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.156084 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmzfs\" (UniqueName: \"kubernetes.io/projected/24a05c78-ed6a-409d-9f7a-62b96952fb28-kube-api-access-qmzfs\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.156154 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-log-ovn\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.156326 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run-ovn\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.260261 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmzfs\" (UniqueName: \"kubernetes.io/projected/24a05c78-ed6a-409d-9f7a-62b96952fb28-kube-api-access-qmzfs\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.260351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-log-ovn\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.260417 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run-ovn\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.260450 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-additional-scripts\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.260518 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.260554 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-scripts\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.261275 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run-ovn\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.261423 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-log-ovn\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.261506 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.262047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-additional-scripts\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.264223 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-scripts\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.287573 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmzfs\" (UniqueName: \"kubernetes.io/projected/24a05c78-ed6a-409d-9f7a-62b96952fb28-kube-api-access-qmzfs\") pod \"ovn-controller-95nwv-config-gtssp\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") " pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:02 crc kubenswrapper[4831]: I1203 06:51:02.299088 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-95nwv-config-gtssp" Dec 03 06:51:04 crc kubenswrapper[4831]: I1203 06:51:04.172649 4831 generic.go:334] "Generic (PLEG): container finished" podID="9ce49982-d05f-49dd-9b08-ae54e662b628" containerID="6d6b1c280fd68fd1f6d2b15694ba84f3b57d21b989d2fef9e7c2285d54cbffc8" exitCode=0 Dec 03 06:51:04 crc kubenswrapper[4831]: I1203 06:51:04.173169 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f2nlc" event={"ID":"9ce49982-d05f-49dd-9b08-ae54e662b628","Type":"ContainerDied","Data":"6d6b1c280fd68fd1f6d2b15694ba84f3b57d21b989d2fef9e7c2285d54cbffc8"} Dec 03 06:51:04 crc kubenswrapper[4831]: I1203 06:51:04.931553 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:51:04 crc kubenswrapper[4831]: I1203 06:51:04.940401 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"swift-storage-0\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " pod="openstack/swift-storage-0" Dec 03 06:51:04 crc kubenswrapper[4831]: I1203 06:51:04.976601 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Dec 03 06:51:06 crc kubenswrapper[4831]: I1203 06:51:06.204885 4831 generic.go:334] "Generic (PLEG): container finished" podID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerID="454f4b86b6c9c0a48f9173d3808d9aeaba4f55df7d2f0c60530104f9fda643a9" exitCode=0 Dec 03 06:51:06 crc kubenswrapper[4831]: I1203 06:51:06.205162 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc0cbb94-92ec-4369-b609-f3186f302c66","Type":"ContainerDied","Data":"454f4b86b6c9c0a48f9173d3808d9aeaba4f55df7d2f0c60530104f9fda643a9"} Dec 03 06:51:06 crc kubenswrapper[4831]: I1203 06:51:06.208942 4831 generic.go:334] "Generic (PLEG): container finished" podID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerID="055b028fb60e01afe549cd5b7477b7dee313aab541390287578dc34dd742e9d3" exitCode=0 Dec 03 06:51:06 crc kubenswrapper[4831]: I1203 06:51:06.208980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d6ac806-4ac5-4de4-b6a0-b265032150f4","Type":"ContainerDied","Data":"055b028fb60e01afe549cd5b7477b7dee313aab541390287578dc34dd742e9d3"} Dec 03 06:51:06 crc kubenswrapper[4831]: I1203 06:51:06.968488 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-95nwv" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" probeResult="failure" output=< Dec 03 06:51:06 crc kubenswrapper[4831]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 06:51:06 crc kubenswrapper[4831]: > Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.782580 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-f2nlc" Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.844397 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-ring-data-devices\") pod \"9ce49982-d05f-49dd-9b08-ae54e662b628\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.844432 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-swiftconf\") pod \"9ce49982-d05f-49dd-9b08-ae54e662b628\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.844451 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-combined-ca-bundle\") pod \"9ce49982-d05f-49dd-9b08-ae54e662b628\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.844498 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-scripts\") pod \"9ce49982-d05f-49dd-9b08-ae54e662b628\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.844513 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-dispersionconf\") pod \"9ce49982-d05f-49dd-9b08-ae54e662b628\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") " Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.844540 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/9ce49982-d05f-49dd-9b08-ae54e662b628-etc-swift\") pod \"9ce49982-d05f-49dd-9b08-ae54e662b628\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") "
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.844583 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws69z\" (UniqueName: \"kubernetes.io/projected/9ce49982-d05f-49dd-9b08-ae54e662b628-kube-api-access-ws69z\") pod \"9ce49982-d05f-49dd-9b08-ae54e662b628\" (UID: \"9ce49982-d05f-49dd-9b08-ae54e662b628\") "
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.846009 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "9ce49982-d05f-49dd-9b08-ae54e662b628" (UID: "9ce49982-d05f-49dd-9b08-ae54e662b628"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.847390 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ce49982-d05f-49dd-9b08-ae54e662b628-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "9ce49982-d05f-49dd-9b08-ae54e662b628" (UID: "9ce49982-d05f-49dd-9b08-ae54e662b628"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.860558 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ce49982-d05f-49dd-9b08-ae54e662b628-kube-api-access-ws69z" (OuterVolumeSpecName: "kube-api-access-ws69z") pod "9ce49982-d05f-49dd-9b08-ae54e662b628" (UID: "9ce49982-d05f-49dd-9b08-ae54e662b628"). InnerVolumeSpecName "kube-api-access-ws69z". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.872980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "9ce49982-d05f-49dd-9b08-ae54e662b628" (UID: "9ce49982-d05f-49dd-9b08-ae54e662b628"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.920694 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-scripts" (OuterVolumeSpecName: "scripts") pod "9ce49982-d05f-49dd-9b08-ae54e662b628" (UID: "9ce49982-d05f-49dd-9b08-ae54e662b628"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.929701 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9ce49982-d05f-49dd-9b08-ae54e662b628" (UID: "9ce49982-d05f-49dd-9b08-ae54e662b628"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.931646 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "9ce49982-d05f-49dd-9b08-ae54e662b628" (UID: "9ce49982-d05f-49dd-9b08-ae54e662b628"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.950823 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.950879 4831 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-dispersionconf\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.950894 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/9ce49982-d05f-49dd-9b08-ae54e662b628-etc-swift\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.950913 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws69z\" (UniqueName: \"kubernetes.io/projected/9ce49982-d05f-49dd-9b08-ae54e662b628-kube-api-access-ws69z\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.950927 4831 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/9ce49982-d05f-49dd-9b08-ae54e662b628-ring-data-devices\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.950944 4831 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-swiftconf\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.950955 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9ce49982-d05f-49dd-9b08-ae54e662b628-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:11 crc kubenswrapper[4831]: I1203 06:51:11.976307 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-95nwv" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" probeResult="failure" output=<
Dec 03 06:51:11 crc kubenswrapper[4831]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status
Dec 03 06:51:11 crc kubenswrapper[4831]: >
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.169032 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-95nwv-config-gtssp"]
Dec 03 06:51:12 crc kubenswrapper[4831]: W1203 06:51:12.184977 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24a05c78_ed6a_409d_9f7a_62b96952fb28.slice/crio-15f7ed77b98ae5d8171c46bd049ae4961f1b751a1aa388c7bdbf883a5eaf2b1c WatchSource:0}: Error finding container 15f7ed77b98ae5d8171c46bd049ae4961f1b751a1aa388c7bdbf883a5eaf2b1c: Status 404 returned error can't find the container with id 15f7ed77b98ae5d8171c46bd049ae4961f1b751a1aa388c7bdbf883a5eaf2b1c
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.233853 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Dec 03 06:51:12 crc kubenswrapper[4831]: W1203 06:51:12.246637 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3e5a86_a8e3_416e_adb8_8b92c3cc81e5.slice/crio-8c5c1ed90709483c852f361b002cbf1be4c46fa6a876d4a80cc2c291e47730c3 WatchSource:0}: Error finding container 8c5c1ed90709483c852f361b002cbf1be4c46fa6a876d4a80cc2c291e47730c3: Status 404 returned error can't find the container with id 8c5c1ed90709483c852f361b002cbf1be4c46fa6a876d4a80cc2c291e47730c3
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.274598 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv-config-gtssp" event={"ID":"24a05c78-ed6a-409d-9f7a-62b96952fb28","Type":"ContainerStarted","Data":"15f7ed77b98ae5d8171c46bd049ae4961f1b751a1aa388c7bdbf883a5eaf2b1c"}
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.277177 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d6ac806-4ac5-4de4-b6a0-b265032150f4","Type":"ContainerStarted","Data":"e701230003bdb542945df73fbff521649b1c0dbe31d133988a35ce0bd844cb8e"}
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.277390 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0"
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.279167 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f2nlc" event={"ID":"9ce49982-d05f-49dd-9b08-ae54e662b628","Type":"ContainerDied","Data":"cdeebd5454301d4f246eb24912e96950742baf7319a722d63d14ae84c0abccf3"}
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.279194 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cdeebd5454301d4f246eb24912e96950742baf7319a722d63d14ae84c0abccf3"
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.279244 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f2nlc"
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.280746 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"8c5c1ed90709483c852f361b002cbf1be4c46fa6a876d4a80cc2c291e47730c3"}
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.286856 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc0cbb94-92ec-4369-b609-f3186f302c66","Type":"ContainerStarted","Data":"309cf75d8cbb562efc0116b04c83dd3b65a4b14e1de35f409f188bc0a497e2b6"}
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.287076 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0"
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.302385 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=42.536211402 podStartE2EDuration="1m22.302368447s" podCreationTimestamp="2025-12-03 06:49:50 +0000 UTC" firstStartedPulling="2025-12-03 06:49:52.77045037 +0000 UTC m=+1130.114033878" lastFinishedPulling="2025-12-03 06:50:32.536607415 +0000 UTC m=+1169.880190923" observedRunningTime="2025-12-03 06:51:12.299722544 +0000 UTC m=+1209.643306072" watchObservedRunningTime="2025-12-03 06:51:12.302368447 +0000 UTC m=+1209.645951955"
Dec 03 06:51:12 crc kubenswrapper[4831]: I1203 06:51:12.338865 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=42.218023818 podStartE2EDuration="1m21.338847968s" podCreationTimestamp="2025-12-03 06:49:51 +0000 UTC" firstStartedPulling="2025-12-03 06:49:53.416640581 +0000 UTC m=+1130.760224089" lastFinishedPulling="2025-12-03 06:50:32.537464731 +0000 UTC m=+1169.881048239" observedRunningTime="2025-12-03 06:51:12.330062954 +0000 UTC m=+1209.673646472" watchObservedRunningTime="2025-12-03 06:51:12.338847968 +0000 UTC m=+1209.682431476"
Dec 03 06:51:13 crc kubenswrapper[4831]: I1203 06:51:13.304007 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nxpq5" event={"ID":"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd","Type":"ContainerStarted","Data":"cd5262ff4d09f2a70f87d311ed4d215c4c526cf5ccf08abf8aeb8dd061f54cd4"}
Dec 03 06:51:13 crc kubenswrapper[4831]: I1203 06:51:13.309599 4831 generic.go:334] "Generic (PLEG): container finished" podID="24a05c78-ed6a-409d-9f7a-62b96952fb28" containerID="b925f0622d92bf472d21abc6c23f6621534691535cbfaf8ba8388d4d004db117" exitCode=0
Dec 03 06:51:13 crc kubenswrapper[4831]: I1203 06:51:13.309786 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv-config-gtssp" event={"ID":"24a05c78-ed6a-409d-9f7a-62b96952fb28","Type":"ContainerDied","Data":"b925f0622d92bf472d21abc6c23f6621534691535cbfaf8ba8388d4d004db117"}
Dec 03 06:51:13 crc kubenswrapper[4831]: I1203 06:51:13.331751 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-nxpq5" podStartSLOduration=2.76931737 podStartE2EDuration="17.331735108s" podCreationTimestamp="2025-12-03 06:50:56 +0000 UTC" firstStartedPulling="2025-12-03 06:50:57.1666682 +0000 UTC m=+1194.510251708" lastFinishedPulling="2025-12-03 06:51:11.729085938 +0000 UTC m=+1209.072669446" observedRunningTime="2025-12-03 06:51:13.328858348 +0000 UTC m=+1210.672441886" watchObservedRunningTime="2025-12-03 06:51:13.331735108 +0000 UTC m=+1210.675318606"
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.322161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8"}
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.322482 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0"}
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.322498 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5"}
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.801134 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv-config-gtssp"
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.919233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-scripts\") pod \"24a05c78-ed6a-409d-9f7a-62b96952fb28\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") "
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.919524 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-log-ovn\") pod \"24a05c78-ed6a-409d-9f7a-62b96952fb28\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") "
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.919554 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmzfs\" (UniqueName: \"kubernetes.io/projected/24a05c78-ed6a-409d-9f7a-62b96952fb28-kube-api-access-qmzfs\") pod \"24a05c78-ed6a-409d-9f7a-62b96952fb28\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") "
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.919585 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-additional-scripts\") pod \"24a05c78-ed6a-409d-9f7a-62b96952fb28\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") "
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.919599 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run\") pod \"24a05c78-ed6a-409d-9f7a-62b96952fb28\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") "
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.919638 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run-ovn\") pod \"24a05c78-ed6a-409d-9f7a-62b96952fb28\" (UID: \"24a05c78-ed6a-409d-9f7a-62b96952fb28\") "
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.920051 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "24a05c78-ed6a-409d-9f7a-62b96952fb28" (UID: "24a05c78-ed6a-409d-9f7a-62b96952fb28"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.920083 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "24a05c78-ed6a-409d-9f7a-62b96952fb28" (UID: "24a05c78-ed6a-409d-9f7a-62b96952fb28"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.920282 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-scripts" (OuterVolumeSpecName: "scripts") pod "24a05c78-ed6a-409d-9f7a-62b96952fb28" (UID: "24a05c78-ed6a-409d-9f7a-62b96952fb28"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.920552 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "24a05c78-ed6a-409d-9f7a-62b96952fb28" (UID: "24a05c78-ed6a-409d-9f7a-62b96952fb28"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.920585 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run" (OuterVolumeSpecName: "var-run") pod "24a05c78-ed6a-409d-9f7a-62b96952fb28" (UID: "24a05c78-ed6a-409d-9f7a-62b96952fb28"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Dec 03 06:51:14 crc kubenswrapper[4831]: I1203 06:51:14.932787 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24a05c78-ed6a-409d-9f7a-62b96952fb28-kube-api-access-qmzfs" (OuterVolumeSpecName: "kube-api-access-qmzfs") pod "24a05c78-ed6a-409d-9f7a-62b96952fb28" (UID: "24a05c78-ed6a-409d-9f7a-62b96952fb28"). InnerVolumeSpecName "kube-api-access-qmzfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.022062 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.022134 4831 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-log-ovn\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.022148 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmzfs\" (UniqueName: \"kubernetes.io/projected/24a05c78-ed6a-409d-9f7a-62b96952fb28-kube-api-access-qmzfs\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.022157 4831 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.022173 4831 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/24a05c78-ed6a-409d-9f7a-62b96952fb28-additional-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.022182 4831 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/24a05c78-ed6a-409d-9f7a-62b96952fb28-var-run-ovn\") on node \"crc\" DevicePath \"\""
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.339770 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83"}
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.343876 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv-config-gtssp" event={"ID":"24a05c78-ed6a-409d-9f7a-62b96952fb28","Type":"ContainerDied","Data":"15f7ed77b98ae5d8171c46bd049ae4961f1b751a1aa388c7bdbf883a5eaf2b1c"}
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.343902 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15f7ed77b98ae5d8171c46bd049ae4961f1b751a1aa388c7bdbf883a5eaf2b1c"
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.344060 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv-config-gtssp"
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.927887 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-95nwv-config-gtssp"]
Dec 03 06:51:15 crc kubenswrapper[4831]: I1203 06:51:15.935825 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-95nwv-config-gtssp"]
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.061859 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-95nwv-config-kxxlh"]
Dec 03 06:51:16 crc kubenswrapper[4831]: E1203 06:51:16.062174 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24a05c78-ed6a-409d-9f7a-62b96952fb28" containerName="ovn-config"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.062185 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="24a05c78-ed6a-409d-9f7a-62b96952fb28" containerName="ovn-config"
Dec 03 06:51:16 crc kubenswrapper[4831]: E1203 06:51:16.062197 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ce49982-d05f-49dd-9b08-ae54e662b628" containerName="swift-ring-rebalance"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.062203 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ce49982-d05f-49dd-9b08-ae54e662b628" containerName="swift-ring-rebalance"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.062360 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ce49982-d05f-49dd-9b08-ae54e662b628" containerName="swift-ring-rebalance"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.062401 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="24a05c78-ed6a-409d-9f7a-62b96952fb28" containerName="ovn-config"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.062886 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.070261 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.084289 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-95nwv-config-kxxlh"]
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.140174 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-scripts\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.140227 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-log-ovn\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.140262 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run-ovn\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.140420 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.140465 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-additional-scripts\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.140650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m59g\" (UniqueName: \"kubernetes.io/projected/f4df6a34-4e97-4800-b3f9-021323687d34-kube-api-access-7m59g\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242136 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-scripts\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-log-ovn\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run-ovn\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242300 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-additional-scripts\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242408 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m59g\" (UniqueName: \"kubernetes.io/projected/f4df6a34-4e97-4800-b3f9-021323687d34-kube-api-access-7m59g\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242569 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-log-ovn\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242662 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.242702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run-ovn\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.243472 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-additional-scripts\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.244639 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-scripts\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.265451 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m59g\" (UniqueName: \"kubernetes.io/projected/f4df6a34-4e97-4800-b3f9-021323687d34-kube-api-access-7m59g\") pod \"ovn-controller-95nwv-config-kxxlh\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.354958 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f"}
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.355033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4"}
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.378851 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:16 crc kubenswrapper[4831]: W1203 06:51:16.880239 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4df6a34_4e97_4800_b3f9_021323687d34.slice/crio-824854bc7ffc5798d7ea7e3689492e0129cba683603aa8e158757e11965d81d4 WatchSource:0}: Error finding container 824854bc7ffc5798d7ea7e3689492e0129cba683603aa8e158757e11965d81d4: Status 404 returned error can't find the container with id 824854bc7ffc5798d7ea7e3689492e0129cba683603aa8e158757e11965d81d4
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.882417 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-95nwv-config-kxxlh"]
Dec 03 06:51:16 crc kubenswrapper[4831]: I1203 06:51:16.983920 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-95nwv"
Dec 03 06:51:17 crc kubenswrapper[4831]: I1203 06:51:17.023027 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24a05c78-ed6a-409d-9f7a-62b96952fb28" path="/var/lib/kubelet/pods/24a05c78-ed6a-409d-9f7a-62b96952fb28/volumes"
Dec 03 06:51:17 crc kubenswrapper[4831]: I1203 06:51:17.366182 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096"}
Dec 03 06:51:17 crc kubenswrapper[4831]: I1203 06:51:17.366488 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde"}
Dec 03 06:51:17 crc kubenswrapper[4831]: I1203 06:51:17.367201 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv-config-kxxlh" event={"ID":"f4df6a34-4e97-4800-b3f9-021323687d34","Type":"ContainerStarted","Data":"4d7afff5bfb7f70c1f6e79575dad07b5923d28bf89c9b89e6bf75d1bd7f50d68"}
Dec 03 06:51:17 crc kubenswrapper[4831]: I1203 06:51:17.367268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv-config-kxxlh" event={"ID":"f4df6a34-4e97-4800-b3f9-021323687d34","Type":"ContainerStarted","Data":"824854bc7ffc5798d7ea7e3689492e0129cba683603aa8e158757e11965d81d4"}
Dec 03 06:51:17 crc kubenswrapper[4831]: I1203 06:51:17.389660 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-95nwv-config-kxxlh" podStartSLOduration=1.389641618 podStartE2EDuration="1.389641618s" podCreationTimestamp="2025-12-03 06:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:17.383377492 +0000 UTC m=+1214.726961010" watchObservedRunningTime="2025-12-03 06:51:17.389641618 +0000 UTC m=+1214.733225126"
Dec 03 06:51:18 crc kubenswrapper[4831]: I1203 06:51:18.380382 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f"}
Dec 03 06:51:18 crc kubenswrapper[4831]: I1203 06:51:18.380960 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d"}
Dec 03 06:51:18 crc kubenswrapper[4831]: I1203 06:51:18.380986 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f"}
Dec 03 06:51:18 crc kubenswrapper[4831]: I1203 06:51:18.382864 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4df6a34-4e97-4800-b3f9-021323687d34" containerID="4d7afff5bfb7f70c1f6e79575dad07b5923d28bf89c9b89e6bf75d1bd7f50d68" exitCode=0
Dec 03 06:51:18 crc kubenswrapper[4831]: I1203 06:51:18.382891 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv-config-kxxlh" event={"ID":"f4df6a34-4e97-4800-b3f9-021323687d34","Type":"ContainerDied","Data":"4d7afff5bfb7f70c1f6e79575dad07b5923d28bf89c9b89e6bf75d1bd7f50d68"}
Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.766656 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv-config-kxxlh"
Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.897866 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-additional-scripts\") pod \"f4df6a34-4e97-4800-b3f9-021323687d34\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") "
Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898026 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m59g\" (UniqueName: \"kubernetes.io/projected/f4df6a34-4e97-4800-b3f9-021323687d34-kube-api-access-7m59g\") pod \"f4df6a34-4e97-4800-b3f9-021323687d34\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") "
Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898072 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-scripts\") pod \"f4df6a34-4e97-4800-b3f9-021323687d34\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") "
Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898094 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run-ovn\") pod \"f4df6a34-4e97-4800-b3f9-021323687d34\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") "
Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898176 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-log-ovn\") pod \"f4df6a34-4e97-4800-b3f9-021323687d34\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") "
Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898241 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\"
(UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run\") pod \"f4df6a34-4e97-4800-b3f9-021323687d34\" (UID: \"f4df6a34-4e97-4800-b3f9-021323687d34\") " Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898334 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f4df6a34-4e97-4800-b3f9-021323687d34" (UID: "f4df6a34-4e97-4800-b3f9-021323687d34"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898396 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f4df6a34-4e97-4800-b3f9-021323687d34" (UID: "f4df6a34-4e97-4800-b3f9-021323687d34"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898565 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run" (OuterVolumeSpecName: "var-run") pod "f4df6a34-4e97-4800-b3f9-021323687d34" (UID: "f4df6a34-4e97-4800-b3f9-021323687d34"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898653 4831 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898678 4831 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.898993 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f4df6a34-4e97-4800-b3f9-021323687d34" (UID: "f4df6a34-4e97-4800-b3f9-021323687d34"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.899234 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-scripts" (OuterVolumeSpecName: "scripts") pod "f4df6a34-4e97-4800-b3f9-021323687d34" (UID: "f4df6a34-4e97-4800-b3f9-021323687d34"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.923027 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4df6a34-4e97-4800-b3f9-021323687d34-kube-api-access-7m59g" (OuterVolumeSpecName: "kube-api-access-7m59g") pod "f4df6a34-4e97-4800-b3f9-021323687d34" (UID: "f4df6a34-4e97-4800-b3f9-021323687d34"). InnerVolumeSpecName "kube-api-access-7m59g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.959649 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-95nwv-config-kxxlh"] Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.971694 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-95nwv-config-kxxlh"] Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.999806 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m59g\" (UniqueName: \"kubernetes.io/projected/f4df6a34-4e97-4800-b3f9-021323687d34-kube-api-access-7m59g\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.999848 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.999861 4831 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f4df6a34-4e97-4800-b3f9-021323687d34-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:19 crc kubenswrapper[4831]: I1203 06:51:19.999873 4831 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f4df6a34-4e97-4800-b3f9-021323687d34-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:20 crc kubenswrapper[4831]: I1203 06:51:20.404421 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="824854bc7ffc5798d7ea7e3689492e0129cba683603aa8e158757e11965d81d4" Dec 03 06:51:20 crc kubenswrapper[4831]: I1203 06:51:20.404472 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-95nwv-config-kxxlh" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.027012 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4df6a34-4e97-4800-b3f9-021323687d34" path="/var/lib/kubelet/pods/f4df6a34-4e97-4800-b3f9-021323687d34/volumes" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.421526 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7"} Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.422237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41"} Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.422369 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a"} Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.422463 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerStarted","Data":"028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea"} Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.472461 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=29.154193089 podStartE2EDuration="34.472435638s" podCreationTimestamp="2025-12-03 06:50:47 +0000 UTC" firstStartedPulling="2025-12-03 06:51:12.250421861 +0000 UTC m=+1209.594005369" lastFinishedPulling="2025-12-03 06:51:17.56866441 +0000 UTC m=+1214.912247918" 
observedRunningTime="2025-12-03 06:51:21.4699633 +0000 UTC m=+1218.813546818" watchObservedRunningTime="2025-12-03 06:51:21.472435638 +0000 UTC m=+1218.816019146" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.757406 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2qhvj"] Dec 03 06:51:21 crc kubenswrapper[4831]: E1203 06:51:21.757871 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4df6a34-4e97-4800-b3f9-021323687d34" containerName="ovn-config" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.757890 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4df6a34-4e97-4800-b3f9-021323687d34" containerName="ovn-config" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.758120 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4df6a34-4e97-4800-b3f9-021323687d34" containerName="ovn-config" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.759276 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.768620 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.778267 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2qhvj"] Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.838546 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-config\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.838605 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.838681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.838758 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8lhq\" (UniqueName: \"kubernetes.io/projected/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-kube-api-access-h8lhq\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.838786 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.838899 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.940611 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-config\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.940655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.940709 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " 
pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.940764 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8lhq\" (UniqueName: \"kubernetes.io/projected/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-kube-api-access-h8lhq\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.940790 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.940812 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.942235 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.942860 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-config\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc 
kubenswrapper[4831]: I1203 06:51:21.943462 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.944102 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.944987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:21 crc kubenswrapper[4831]: I1203 06:51:21.985926 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8lhq\" (UniqueName: \"kubernetes.io/projected/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-kube-api-access-h8lhq\") pod \"dnsmasq-dns-5c79d794d7-2qhvj\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:22 crc kubenswrapper[4831]: I1203 06:51:22.072372 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Dec 03 06:51:22 crc kubenswrapper[4831]: I1203 06:51:22.077477 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:22 crc kubenswrapper[4831]: I1203 06:51:22.715589 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2qhvj"] Dec 03 06:51:22 crc kubenswrapper[4831]: I1203 06:51:22.729540 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.002189 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-h8zhk"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.003713 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.025425 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h8zhk"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.062247 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpxmp\" (UniqueName: \"kubernetes.io/projected/3614a116-7996-4106-a222-e2542d9dd89d-kube-api-access-fpxmp\") pod \"cinder-db-create-h8zhk\" (UID: \"3614a116-7996-4106-a222-e2542d9dd89d\") " pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.062431 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3614a116-7996-4106-a222-e2542d9dd89d-operator-scripts\") pod \"cinder-db-create-h8zhk\" (UID: \"3614a116-7996-4106-a222-e2542d9dd89d\") " pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.107579 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-3a6d-account-create-update-gb5qw"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.108675 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.110952 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.115952 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3a6d-account-create-update-gb5qw"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.124784 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-8rkh8"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.125797 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.159635 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8rkh8"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.164542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqqrj\" (UniqueName: \"kubernetes.io/projected/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-kube-api-access-xqqrj\") pod \"cinder-3a6d-account-create-update-gb5qw\" (UID: \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\") " pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.164612 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpxmp\" (UniqueName: \"kubernetes.io/projected/3614a116-7996-4106-a222-e2542d9dd89d-kube-api-access-fpxmp\") pod \"cinder-db-create-h8zhk\" (UID: \"3614a116-7996-4106-a222-e2542d9dd89d\") " pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.164655 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/66411938-f6a0-4c6a-a478-2b9a451a2275-operator-scripts\") pod \"barbican-db-create-8rkh8\" (UID: \"66411938-f6a0-4c6a-a478-2b9a451a2275\") " pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.164685 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3614a116-7996-4106-a222-e2542d9dd89d-operator-scripts\") pod \"cinder-db-create-h8zhk\" (UID: \"3614a116-7996-4106-a222-e2542d9dd89d\") " pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.164736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5g97k\" (UniqueName: \"kubernetes.io/projected/66411938-f6a0-4c6a-a478-2b9a451a2275-kube-api-access-5g97k\") pod \"barbican-db-create-8rkh8\" (UID: \"66411938-f6a0-4c6a-a478-2b9a451a2275\") " pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.164798 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-operator-scripts\") pod \"cinder-3a6d-account-create-update-gb5qw\" (UID: \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\") " pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.165882 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3614a116-7996-4106-a222-e2542d9dd89d-operator-scripts\") pod \"cinder-db-create-h8zhk\" (UID: \"3614a116-7996-4106-a222-e2542d9dd89d\") " pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.189350 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpxmp\" (UniqueName: 
\"kubernetes.io/projected/3614a116-7996-4106-a222-e2542d9dd89d-kube-api-access-fpxmp\") pod \"cinder-db-create-h8zhk\" (UID: \"3614a116-7996-4106-a222-e2542d9dd89d\") " pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.266144 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqqrj\" (UniqueName: \"kubernetes.io/projected/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-kube-api-access-xqqrj\") pod \"cinder-3a6d-account-create-update-gb5qw\" (UID: \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\") " pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.266440 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66411938-f6a0-4c6a-a478-2b9a451a2275-operator-scripts\") pod \"barbican-db-create-8rkh8\" (UID: \"66411938-f6a0-4c6a-a478-2b9a451a2275\") " pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.266572 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5g97k\" (UniqueName: \"kubernetes.io/projected/66411938-f6a0-4c6a-a478-2b9a451a2275-kube-api-access-5g97k\") pod \"barbican-db-create-8rkh8\" (UID: \"66411938-f6a0-4c6a-a478-2b9a451a2275\") " pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.266688 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-operator-scripts\") pod \"cinder-3a6d-account-create-update-gb5qw\" (UID: \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\") " pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.267377 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/66411938-f6a0-4c6a-a478-2b9a451a2275-operator-scripts\") pod \"barbican-db-create-8rkh8\" (UID: \"66411938-f6a0-4c6a-a478-2b9a451a2275\") " pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.267530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-operator-scripts\") pod \"cinder-3a6d-account-create-update-gb5qw\" (UID: \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\") " pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.281459 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqqrj\" (UniqueName: \"kubernetes.io/projected/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-kube-api-access-xqqrj\") pod \"cinder-3a6d-account-create-update-gb5qw\" (UID: \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\") " pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.295802 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5g97k\" (UniqueName: \"kubernetes.io/projected/66411938-f6a0-4c6a-a478-2b9a451a2275-kube-api-access-5g97k\") pod \"barbican-db-create-8rkh8\" (UID: \"66411938-f6a0-4c6a-a478-2b9a451a2275\") " pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.314425 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-3e51-account-create-update-k4ppw"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.323702 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.323938 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.325606 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3e51-account-create-update-k4ppw"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.328456 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.370909 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmflk\" (UniqueName: \"kubernetes.io/projected/b12b0f68-97d1-4253-b83b-692dbb30d970-kube-api-access-jmflk\") pod \"barbican-3e51-account-create-update-k4ppw\" (UID: \"b12b0f68-97d1-4253-b83b-692dbb30d970\") " pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.370944 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b12b0f68-97d1-4253-b83b-692dbb30d970-operator-scripts\") pod \"barbican-3e51-account-create-update-k4ppw\" (UID: \"b12b0f68-97d1-4253-b83b-692dbb30d970\") " pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.384154 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-vgj9j"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.389582 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.393475 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.393599 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.393710 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8w6b" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.393803 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.394738 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vgj9j"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.446176 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.448417 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.462073 4831 generic.go:334] "Generic (PLEG): container finished" podID="e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" containerID="cd5262ff4d09f2a70f87d311ed4d215c4c526cf5ccf08abf8aeb8dd061f54cd4" exitCode=0 Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.462148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nxpq5" event={"ID":"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd","Type":"ContainerDied","Data":"cd5262ff4d09f2a70f87d311ed4d215c4c526cf5ccf08abf8aeb8dd061f54cd4"} Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.466854 4831 generic.go:334] "Generic (PLEG): container finished" podID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" containerID="7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898" exitCode=0 Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.466919 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" event={"ID":"ebd0577c-85d9-4133-92a0-c9d6f78de9c3","Type":"ContainerDied","Data":"7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898"} Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.466950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" event={"ID":"ebd0577c-85d9-4133-92a0-c9d6f78de9c3","Type":"ContainerStarted","Data":"e7019e118932256266272f3eece8363dc69c2973aac2312ca819d95a31a5ba78"} Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.486603 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tmhsz\" (UniqueName: \"kubernetes.io/projected/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-kube-api-access-tmhsz\") pod \"keystone-db-sync-vgj9j\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.486733 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-combined-ca-bundle\") pod \"keystone-db-sync-vgj9j\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.486768 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmflk\" (UniqueName: \"kubernetes.io/projected/b12b0f68-97d1-4253-b83b-692dbb30d970-kube-api-access-jmflk\") pod \"barbican-3e51-account-create-update-k4ppw\" (UID: \"b12b0f68-97d1-4253-b83b-692dbb30d970\") " pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.486791 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b12b0f68-97d1-4253-b83b-692dbb30d970-operator-scripts\") pod \"barbican-3e51-account-create-update-k4ppw\" (UID: \"b12b0f68-97d1-4253-b83b-692dbb30d970\") " pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.486815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-config-data\") pod \"keystone-db-sync-vgj9j\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.489240 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b12b0f68-97d1-4253-b83b-692dbb30d970-operator-scripts\") pod \"barbican-3e51-account-create-update-k4ppw\" (UID: \"b12b0f68-97d1-4253-b83b-692dbb30d970\") " pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:23 crc 
kubenswrapper[4831]: I1203 06:51:23.516048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmflk\" (UniqueName: \"kubernetes.io/projected/b12b0f68-97d1-4253-b83b-692dbb30d970-kube-api-access-jmflk\") pod \"barbican-3e51-account-create-update-k4ppw\" (UID: \"b12b0f68-97d1-4253-b83b-692dbb30d970\") " pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.537920 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-2ms27"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.538969 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.565370 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b468-account-create-update-h97z9"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.566589 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.569039 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.573814 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2ms27"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.591330 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b468-account-create-update-h97z9"] Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.591487 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tmhsz\" (UniqueName: \"kubernetes.io/projected/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-kube-api-access-tmhsz\") pod \"keystone-db-sync-vgj9j\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.591547 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-combined-ca-bundle\") pod \"keystone-db-sync-vgj9j\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.592093 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-config-data\") pod \"keystone-db-sync-vgj9j\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.597417 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-config-data\") pod \"keystone-db-sync-vgj9j\" (UID: 
\"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.602288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-combined-ca-bundle\") pod \"keystone-db-sync-vgj9j\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.609006 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tmhsz\" (UniqueName: \"kubernetes.io/projected/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-kube-api-access-tmhsz\") pod \"keystone-db-sync-vgj9j\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.695062 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-operator-scripts\") pod \"neutron-b468-account-create-update-h97z9\" (UID: \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\") " pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.695387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mgxd\" (UniqueName: \"kubernetes.io/projected/59ff8d77-29ce-4d59-919c-4280b2489608-kube-api-access-2mgxd\") pod \"neutron-db-create-2ms27\" (UID: \"59ff8d77-29ce-4d59-919c-4280b2489608\") " pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.695448 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k78z\" (UniqueName: \"kubernetes.io/projected/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-kube-api-access-6k78z\") pod 
\"neutron-b468-account-create-update-h97z9\" (UID: \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\") " pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.695663 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff8d77-29ce-4d59-919c-4280b2489608-operator-scripts\") pod \"neutron-db-create-2ms27\" (UID: \"59ff8d77-29ce-4d59-919c-4280b2489608\") " pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.760872 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.769886 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.797483 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff8d77-29ce-4d59-919c-4280b2489608-operator-scripts\") pod \"neutron-db-create-2ms27\" (UID: \"59ff8d77-29ce-4d59-919c-4280b2489608\") " pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.797637 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mgxd\" (UniqueName: \"kubernetes.io/projected/59ff8d77-29ce-4d59-919c-4280b2489608-kube-api-access-2mgxd\") pod \"neutron-db-create-2ms27\" (UID: \"59ff8d77-29ce-4d59-919c-4280b2489608\") " pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.797661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-operator-scripts\") pod 
\"neutron-b468-account-create-update-h97z9\" (UID: \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\") " pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.797748 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k78z\" (UniqueName: \"kubernetes.io/projected/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-kube-api-access-6k78z\") pod \"neutron-b468-account-create-update-h97z9\" (UID: \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\") " pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.798378 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff8d77-29ce-4d59-919c-4280b2489608-operator-scripts\") pod \"neutron-db-create-2ms27\" (UID: \"59ff8d77-29ce-4d59-919c-4280b2489608\") " pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.802360 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-operator-scripts\") pod \"neutron-b468-account-create-update-h97z9\" (UID: \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\") " pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.816724 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k78z\" (UniqueName: \"kubernetes.io/projected/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-kube-api-access-6k78z\") pod \"neutron-b468-account-create-update-h97z9\" (UID: \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\") " pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.823851 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mgxd\" (UniqueName: 
\"kubernetes.io/projected/59ff8d77-29ce-4d59-919c-4280b2489608-kube-api-access-2mgxd\") pod \"neutron-db-create-2ms27\" (UID: \"59ff8d77-29ce-4d59-919c-4280b2489608\") " pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:23 crc kubenswrapper[4831]: I1203 06:51:23.863436 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:23.888211 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:23.989764 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-h8zhk"] Dec 03 06:51:24 crc kubenswrapper[4831]: W1203 06:51:24.007266 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3614a116_7996_4106_a222_e2542d9dd89d.slice/crio-445c23513b1a2fafbfacc5ddd89aa6339ff1d781828de5d1d04693201692e385 WatchSource:0}: Error finding container 445c23513b1a2fafbfacc5ddd89aa6339ff1d781828de5d1d04693201692e385: Status 404 returned error can't find the container with id 445c23513b1a2fafbfacc5ddd89aa6339ff1d781828de5d1d04693201692e385 Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.059919 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-3a6d-account-create-update-gb5qw"] Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.146514 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-8rkh8"] Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.478055 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" event={"ID":"ebd0577c-85d9-4133-92a0-c9d6f78de9c3","Type":"ContainerStarted","Data":"0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d"} Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.478288 
4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.479562 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h8zhk" event={"ID":"3614a116-7996-4106-a222-e2542d9dd89d","Type":"ContainerStarted","Data":"7ef08bde82922dfbabfe3171b51605013b1ffda34f112a536b00995fa61b5fd4"} Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.479584 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h8zhk" event={"ID":"3614a116-7996-4106-a222-e2542d9dd89d","Type":"ContainerStarted","Data":"445c23513b1a2fafbfacc5ddd89aa6339ff1d781828de5d1d04693201692e385"} Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.485770 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8rkh8" event={"ID":"66411938-f6a0-4c6a-a478-2b9a451a2275","Type":"ContainerStarted","Data":"18b7e657c68eea07658d1883b7cea150a92aa7f604639f69593adc7e316fa57c"} Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.485836 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8rkh8" event={"ID":"66411938-f6a0-4c6a-a478-2b9a451a2275","Type":"ContainerStarted","Data":"bd05536290e72e348b8efb2dfb064255757ecb64bc9b95b58518a519f8e64ef0"} Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.495390 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3a6d-account-create-update-gb5qw" event={"ID":"4a2f4d8e-71bd-4abc-b0db-165cb43fea80","Type":"ContainerStarted","Data":"68e8d72f0a3c2b88f2aef5d2282e0149f866aaf159fa298da782fd0be9e123cb"} Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.495442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3a6d-account-create-update-gb5qw" 
event={"ID":"4a2f4d8e-71bd-4abc-b0db-165cb43fea80","Type":"ContainerStarted","Data":"df640fa6200e6d0969ca0eef9146dc5b41e707fdc0aa94c3f9bf5eb62cd156b0"} Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.499546 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" podStartSLOduration=3.499529841 podStartE2EDuration="3.499529841s" podCreationTimestamp="2025-12-03 06:51:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.495264947 +0000 UTC m=+1221.838848445" watchObservedRunningTime="2025-12-03 06:51:24.499529841 +0000 UTC m=+1221.843113349" Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.512971 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-h8zhk" podStartSLOduration=2.5129588910000002 podStartE2EDuration="2.512958891s" podCreationTimestamp="2025-12-03 06:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.510479023 +0000 UTC m=+1221.854062531" watchObservedRunningTime="2025-12-03 06:51:24.512958891 +0000 UTC m=+1221.856542399" Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.547029 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-8rkh8" podStartSLOduration=1.547009716 podStartE2EDuration="1.547009716s" podCreationTimestamp="2025-12-03 06:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.524539563 +0000 UTC m=+1221.868123071" watchObservedRunningTime="2025-12-03 06:51:24.547009716 +0000 UTC m=+1221.890593224" Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.568202 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/cinder-3a6d-account-create-update-gb5qw" podStartSLOduration=1.568170498 podStartE2EDuration="1.568170498s" podCreationTimestamp="2025-12-03 06:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:24.542337 +0000 UTC m=+1221.885920508" watchObservedRunningTime="2025-12-03 06:51:24.568170498 +0000 UTC m=+1221.911754006" Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.919916 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-nxpq5" Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.975015 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-3e51-account-create-update-k4ppw"] Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.981826 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-2ms27"] Dec 03 06:51:24 crc kubenswrapper[4831]: W1203 06:51:24.990980 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb55e481e_04a7_4cc6_bddd_7f8a87ca6b77.slice/crio-034629bb80b99cb32702afbd9273965123715babe7fea3f5ca70cc28540b1bb2 WatchSource:0}: Error finding container 034629bb80b99cb32702afbd9273965123715babe7fea3f5ca70cc28540b1bb2: Status 404 returned error can't find the container with id 034629bb80b99cb32702afbd9273965123715babe7fea3f5ca70cc28540b1bb2 Dec 03 06:51:24 crc kubenswrapper[4831]: W1203 06:51:24.991502 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0310482f_ea00_44c5_8450_6b3b3ebe5e5f.slice/crio-fcafac7a8e0c0d8211168ebc9b2f5364d8a3a17eb67571416b9df2f9c3c4edf5 WatchSource:0}: Error finding container fcafac7a8e0c0d8211168ebc9b2f5364d8a3a17eb67571416b9df2f9c3c4edf5: Status 404 returned error can't find the container with id 
fcafac7a8e0c0d8211168ebc9b2f5364d8a3a17eb67571416b9df2f9c3c4edf5 Dec 03 06:51:24 crc kubenswrapper[4831]: I1203 06:51:24.992897 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-vgj9j"] Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.000840 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b468-account-create-update-h97z9"] Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.019455 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-config-data\") pod \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.019753 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-db-sync-config-data\") pod \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.019829 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-combined-ca-bundle\") pod \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.019878 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ctfv\" (UniqueName: \"kubernetes.io/projected/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-kube-api-access-4ctfv\") pod \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\" (UID: \"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd\") " Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.025408 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" (UID: "e07ae49e-a6fb-478a-8b6f-3f8f687f4afd"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.031083 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-kube-api-access-4ctfv" (OuterVolumeSpecName: "kube-api-access-4ctfv") pod "e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" (UID: "e07ae49e-a6fb-478a-8b6f-3f8f687f4afd"). InnerVolumeSpecName "kube-api-access-4ctfv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.058039 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" (UID: "e07ae49e-a6fb-478a-8b6f-3f8f687f4afd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.083143 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-config-data" (OuterVolumeSpecName: "config-data") pod "e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" (UID: "e07ae49e-a6fb-478a-8b6f-3f8f687f4afd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.122213 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.122244 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.122254 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.122262 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ctfv\" (UniqueName: \"kubernetes.io/projected/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd-kube-api-access-4ctfv\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.547912 4831 generic.go:334] "Generic (PLEG): container finished" podID="4a2f4d8e-71bd-4abc-b0db-165cb43fea80" containerID="68e8d72f0a3c2b88f2aef5d2282e0149f866aaf159fa298da782fd0be9e123cb" exitCode=0 Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.548133 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3a6d-account-create-update-gb5qw" event={"ID":"4a2f4d8e-71bd-4abc-b0db-165cb43fea80","Type":"ContainerDied","Data":"68e8d72f0a3c2b88f2aef5d2282e0149f866aaf159fa298da782fd0be9e123cb"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.551868 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vgj9j" 
event={"ID":"0310482f-ea00-44c5-8450-6b3b3ebe5e5f","Type":"ContainerStarted","Data":"fcafac7a8e0c0d8211168ebc9b2f5364d8a3a17eb67571416b9df2f9c3c4edf5"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.554252 4831 generic.go:334] "Generic (PLEG): container finished" podID="b55e481e-04a7-4cc6-bddd-7f8a87ca6b77" containerID="5173ebd2612f2bca1eb5bfe9a9a4e301ffa1ac330d0fd6c0b492b51d60a348c0" exitCode=0 Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.554424 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b468-account-create-update-h97z9" event={"ID":"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77","Type":"ContainerDied","Data":"5173ebd2612f2bca1eb5bfe9a9a4e301ffa1ac330d0fd6c0b492b51d60a348c0"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.554456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b468-account-create-update-h97z9" event={"ID":"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77","Type":"ContainerStarted","Data":"034629bb80b99cb32702afbd9273965123715babe7fea3f5ca70cc28540b1bb2"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.556333 4831 generic.go:334] "Generic (PLEG): container finished" podID="b12b0f68-97d1-4253-b83b-692dbb30d970" containerID="cadf275943a0d51df428d9e8f0db5f89e2c97e8e7f7a9f3ee7ee2e2d69b75447" exitCode=0 Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.556430 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3e51-account-create-update-k4ppw" event={"ID":"b12b0f68-97d1-4253-b83b-692dbb30d970","Type":"ContainerDied","Data":"cadf275943a0d51df428d9e8f0db5f89e2c97e8e7f7a9f3ee7ee2e2d69b75447"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.556464 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3e51-account-create-update-k4ppw" event={"ID":"b12b0f68-97d1-4253-b83b-692dbb30d970","Type":"ContainerStarted","Data":"e54e1e62b1f27b22569ecf8055063901f50fdba152a7300f7ac1b8bfb15acc3f"} Dec 03 06:51:25 crc 
kubenswrapper[4831]: I1203 06:51:25.558303 4831 generic.go:334] "Generic (PLEG): container finished" podID="59ff8d77-29ce-4d59-919c-4280b2489608" containerID="0cd0a2d44929d2103ada46d13db9172b16fd96eb4d3d73a7610523b81b7a7a5f" exitCode=0 Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.558350 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2ms27" event={"ID":"59ff8d77-29ce-4d59-919c-4280b2489608","Type":"ContainerDied","Data":"0cd0a2d44929d2103ada46d13db9172b16fd96eb4d3d73a7610523b81b7a7a5f"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.558377 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2ms27" event={"ID":"59ff8d77-29ce-4d59-919c-4280b2489608","Type":"ContainerStarted","Data":"ddbe7e326110778436ebab8a3dcdfa70fb029b98f054c5100891754d91b1676c"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.569218 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-nxpq5" event={"ID":"e07ae49e-a6fb-478a-8b6f-3f8f687f4afd","Type":"ContainerDied","Data":"c14983c3ee527d1c356f19fee494d88ec4a098cd3a3702b16a4a20e798576de3"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.569469 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c14983c3ee527d1c356f19fee494d88ec4a098cd3a3702b16a4a20e798576de3" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.569540 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-nxpq5" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.573648 4831 generic.go:334] "Generic (PLEG): container finished" podID="3614a116-7996-4106-a222-e2542d9dd89d" containerID="7ef08bde82922dfbabfe3171b51605013b1ffda34f112a536b00995fa61b5fd4" exitCode=0 Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.573700 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h8zhk" event={"ID":"3614a116-7996-4106-a222-e2542d9dd89d","Type":"ContainerDied","Data":"7ef08bde82922dfbabfe3171b51605013b1ffda34f112a536b00995fa61b5fd4"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.577227 4831 generic.go:334] "Generic (PLEG): container finished" podID="66411938-f6a0-4c6a-a478-2b9a451a2275" containerID="18b7e657c68eea07658d1883b7cea150a92aa7f604639f69593adc7e316fa57c" exitCode=0 Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.577375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8rkh8" event={"ID":"66411938-f6a0-4c6a-a478-2b9a451a2275","Type":"ContainerDied","Data":"18b7e657c68eea07658d1883b7cea150a92aa7f604639f69593adc7e316fa57c"} Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.824735 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2qhvj"] Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.867573 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwh2s"] Dec 03 06:51:25 crc kubenswrapper[4831]: E1203 06:51:25.867951 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" containerName="glance-db-sync" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.867963 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" containerName="glance-db-sync" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.868118 4831 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" containerName="glance-db-sync" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.868955 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:25 crc kubenswrapper[4831]: I1203 06:51:25.886151 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwh2s"] Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.037983 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.038050 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.038074 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.038286 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-config\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: 
\"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.038454 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h44j\" (UniqueName: \"kubernetes.io/projected/9c3895c6-516b-4681-b3c3-0e9291dfe322-kube-api-access-2h44j\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.038707 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.140851 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.140929 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.140969 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: 
\"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.140989 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.141016 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-config\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.141034 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h44j\" (UniqueName: \"kubernetes.io/projected/9c3895c6-516b-4681-b3c3-0e9291dfe322-kube-api-access-2h44j\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.141969 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.142102 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " 
pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.142726 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.142801 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-config\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.143097 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.194270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h44j\" (UniqueName: \"kubernetes.io/projected/9c3895c6-516b-4681-b3c3-0e9291dfe322-kube-api-access-2h44j\") pod \"dnsmasq-dns-5f59b8f679-pwh2s\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.490775 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:26 crc kubenswrapper[4831]: I1203 06:51:26.583671 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" podUID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" containerName="dnsmasq-dns" containerID="cri-o://0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d" gracePeriod=10 Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.026932 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.035046 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.052584 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.139968 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.162913 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqqrj\" (UniqueName: \"kubernetes.io/projected/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-kube-api-access-xqqrj\") pod \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\" (UID: \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.162978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-operator-scripts\") pod \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\" (UID: \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.163005 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k78z\" (UniqueName: \"kubernetes.io/projected/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-kube-api-access-6k78z\") pod \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\" (UID: \"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.163028 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpxmp\" (UniqueName: \"kubernetes.io/projected/3614a116-7996-4106-a222-e2542d9dd89d-kube-api-access-fpxmp\") pod \"3614a116-7996-4106-a222-e2542d9dd89d\" (UID: \"3614a116-7996-4106-a222-e2542d9dd89d\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.163182 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-operator-scripts\") pod \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\" (UID: \"4a2f4d8e-71bd-4abc-b0db-165cb43fea80\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.163252 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3614a116-7996-4106-a222-e2542d9dd89d-operator-scripts\") pod \"3614a116-7996-4106-a222-e2542d9dd89d\" (UID: \"3614a116-7996-4106-a222-e2542d9dd89d\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.166577 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b55e481e-04a7-4cc6-bddd-7f8a87ca6b77" (UID: "b55e481e-04a7-4cc6-bddd-7f8a87ca6b77"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.166666 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3614a116-7996-4106-a222-e2542d9dd89d-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3614a116-7996-4106-a222-e2542d9dd89d" (UID: "3614a116-7996-4106-a222-e2542d9dd89d"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.166671 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a2f4d8e-71bd-4abc-b0db-165cb43fea80" (UID: "4a2f4d8e-71bd-4abc-b0db-165cb43fea80"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.172910 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.173207 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3614a116-7996-4106-a222-e2542d9dd89d-kube-api-access-fpxmp" (OuterVolumeSpecName: "kube-api-access-fpxmp") pod "3614a116-7996-4106-a222-e2542d9dd89d" (UID: "3614a116-7996-4106-a222-e2542d9dd89d"). InnerVolumeSpecName "kube-api-access-fpxmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.176680 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-kube-api-access-xqqrj" (OuterVolumeSpecName: "kube-api-access-xqqrj") pod "4a2f4d8e-71bd-4abc-b0db-165cb43fea80" (UID: "4a2f4d8e-71bd-4abc-b0db-165cb43fea80"). InnerVolumeSpecName "kube-api-access-xqqrj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.177481 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-kube-api-access-6k78z" (OuterVolumeSpecName: "kube-api-access-6k78z") pod "b55e481e-04a7-4cc6-bddd-7f8a87ca6b77" (UID: "b55e481e-04a7-4cc6-bddd-7f8a87ca6b77"). InnerVolumeSpecName "kube-api-access-6k78z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.181104 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.187597 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266578 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mgxd\" (UniqueName: \"kubernetes.io/projected/59ff8d77-29ce-4d59-919c-4280b2489608-kube-api-access-2mgxd\") pod \"59ff8d77-29ce-4d59-919c-4280b2489608\" (UID: \"59ff8d77-29ce-4d59-919c-4280b2489608\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266615 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jmflk\" (UniqueName: \"kubernetes.io/projected/b12b0f68-97d1-4253-b83b-692dbb30d970-kube-api-access-jmflk\") pod \"b12b0f68-97d1-4253-b83b-692dbb30d970\" (UID: \"b12b0f68-97d1-4253-b83b-692dbb30d970\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266636 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66411938-f6a0-4c6a-a478-2b9a451a2275-operator-scripts\") pod \"66411938-f6a0-4c6a-a478-2b9a451a2275\" (UID: \"66411938-f6a0-4c6a-a478-2b9a451a2275\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266672 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-config\") pod \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266729 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-swift-storage-0\") pod \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266759 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-h8lhq\" (UniqueName: \"kubernetes.io/projected/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-kube-api-access-h8lhq\") pod \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266777 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff8d77-29ce-4d59-919c-4280b2489608-operator-scripts\") pod \"59ff8d77-29ce-4d59-919c-4280b2489608\" (UID: \"59ff8d77-29ce-4d59-919c-4280b2489608\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266820 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-sb\") pod \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266843 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5g97k\" (UniqueName: \"kubernetes.io/projected/66411938-f6a0-4c6a-a478-2b9a451a2275-kube-api-access-5g97k\") pod \"66411938-f6a0-4c6a-a478-2b9a451a2275\" (UID: \"66411938-f6a0-4c6a-a478-2b9a451a2275\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266861 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b12b0f68-97d1-4253-b83b-692dbb30d970-operator-scripts\") pod \"b12b0f68-97d1-4253-b83b-692dbb30d970\" (UID: \"b12b0f68-97d1-4253-b83b-692dbb30d970\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266903 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-svc\") pod \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\" (UID: 
\"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.266920 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-nb\") pod \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\" (UID: \"ebd0577c-85d9-4133-92a0-c9d6f78de9c3\") " Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.267253 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3614a116-7996-4106-a222-e2542d9dd89d-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.267263 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xqqrj\" (UniqueName: \"kubernetes.io/projected/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-kube-api-access-xqqrj\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.267272 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.267281 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k78z\" (UniqueName: \"kubernetes.io/projected/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77-kube-api-access-6k78z\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.267290 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpxmp\" (UniqueName: \"kubernetes.io/projected/3614a116-7996-4106-a222-e2542d9dd89d-kube-api-access-fpxmp\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.267298 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/4a2f4d8e-71bd-4abc-b0db-165cb43fea80-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.270447 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/66411938-f6a0-4c6a-a478-2b9a451a2275-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "66411938-f6a0-4c6a-a478-2b9a451a2275" (UID: "66411938-f6a0-4c6a-a478-2b9a451a2275"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.270769 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-kube-api-access-h8lhq" (OuterVolumeSpecName: "kube-api-access-h8lhq") pod "ebd0577c-85d9-4133-92a0-c9d6f78de9c3" (UID: "ebd0577c-85d9-4133-92a0-c9d6f78de9c3"). InnerVolumeSpecName "kube-api-access-h8lhq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.273117 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b12b0f68-97d1-4253-b83b-692dbb30d970-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b12b0f68-97d1-4253-b83b-692dbb30d970" (UID: "b12b0f68-97d1-4253-b83b-692dbb30d970"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.275553 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12b0f68-97d1-4253-b83b-692dbb30d970-kube-api-access-jmflk" (OuterVolumeSpecName: "kube-api-access-jmflk") pod "b12b0f68-97d1-4253-b83b-692dbb30d970" (UID: "b12b0f68-97d1-4253-b83b-692dbb30d970"). InnerVolumeSpecName "kube-api-access-jmflk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.275778 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59ff8d77-29ce-4d59-919c-4280b2489608-kube-api-access-2mgxd" (OuterVolumeSpecName: "kube-api-access-2mgxd") pod "59ff8d77-29ce-4d59-919c-4280b2489608" (UID: "59ff8d77-29ce-4d59-919c-4280b2489608"). InnerVolumeSpecName "kube-api-access-2mgxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.275841 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/59ff8d77-29ce-4d59-919c-4280b2489608-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "59ff8d77-29ce-4d59-919c-4280b2489608" (UID: "59ff8d77-29ce-4d59-919c-4280b2489608"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.284523 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/66411938-f6a0-4c6a-a478-2b9a451a2275-kube-api-access-5g97k" (OuterVolumeSpecName: "kube-api-access-5g97k") pod "66411938-f6a0-4c6a-a478-2b9a451a2275" (UID: "66411938-f6a0-4c6a-a478-2b9a451a2275"). InnerVolumeSpecName "kube-api-access-5g97k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.325568 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ebd0577c-85d9-4133-92a0-c9d6f78de9c3" (UID: "ebd0577c-85d9-4133-92a0-c9d6f78de9c3"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.339924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ebd0577c-85d9-4133-92a0-c9d6f78de9c3" (UID: "ebd0577c-85d9-4133-92a0-c9d6f78de9c3"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.345045 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-config" (OuterVolumeSpecName: "config") pod "ebd0577c-85d9-4133-92a0-c9d6f78de9c3" (UID: "ebd0577c-85d9-4133-92a0-c9d6f78de9c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.345826 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwh2s"] Dec 03 06:51:27 crc kubenswrapper[4831]: W1203 06:51:27.347177 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c3895c6_516b_4681_b3c3_0e9291dfe322.slice/crio-f8a9f1707ad07bdeff1e1ee42e52a17df79e02faa58bcf8cf43507966af343c5 WatchSource:0}: Error finding container f8a9f1707ad07bdeff1e1ee42e52a17df79e02faa58bcf8cf43507966af343c5: Status 404 returned error can't find the container with id f8a9f1707ad07bdeff1e1ee42e52a17df79e02faa58bcf8cf43507966af343c5 Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.360859 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ebd0577c-85d9-4133-92a0-c9d6f78de9c3" (UID: "ebd0577c-85d9-4133-92a0-c9d6f78de9c3"). 
InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.366888 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "ebd0577c-85d9-4133-92a0-c9d6f78de9c3" (UID: "ebd0577c-85d9-4133-92a0-c9d6f78de9c3"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368289 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8lhq\" (UniqueName: \"kubernetes.io/projected/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-kube-api-access-h8lhq\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368349 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/59ff8d77-29ce-4d59-919c-4280b2489608-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368358 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368367 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5g97k\" (UniqueName: \"kubernetes.io/projected/66411938-f6a0-4c6a-a478-2b9a451a2275-kube-api-access-5g97k\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368379 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b12b0f68-97d1-4253-b83b-692dbb30d970-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368390 4831 
reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368398 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368406 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mgxd\" (UniqueName: \"kubernetes.io/projected/59ff8d77-29ce-4d59-919c-4280b2489608-kube-api-access-2mgxd\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368415 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jmflk\" (UniqueName: \"kubernetes.io/projected/b12b0f68-97d1-4253-b83b-692dbb30d970-kube-api-access-jmflk\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368423 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/66411938-f6a0-4c6a-a478-2b9a451a2275-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368431 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.368438 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/ebd0577c-85d9-4133-92a0-c9d6f78de9c3-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.594760 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-8rkh8" 
event={"ID":"66411938-f6a0-4c6a-a478-2b9a451a2275","Type":"ContainerDied","Data":"bd05536290e72e348b8efb2dfb064255757ecb64bc9b95b58518a519f8e64ef0"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.594810 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bd05536290e72e348b8efb2dfb064255757ecb64bc9b95b58518a519f8e64ef0" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.594851 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-8rkh8" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.602184 4831 generic.go:334] "Generic (PLEG): container finished" podID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerID="c6d0fe5894f63e35b1a7232d25b36b47f61870d8b5406acfc75dfc2ca5424ee7" exitCode=0 Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.602286 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" event={"ID":"9c3895c6-516b-4681-b3c3-0e9291dfe322","Type":"ContainerDied","Data":"c6d0fe5894f63e35b1a7232d25b36b47f61870d8b5406acfc75dfc2ca5424ee7"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.602340 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" event={"ID":"9c3895c6-516b-4681-b3c3-0e9291dfe322","Type":"ContainerStarted","Data":"f8a9f1707ad07bdeff1e1ee42e52a17df79e02faa58bcf8cf43507966af343c5"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.606301 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-3a6d-account-create-update-gb5qw" event={"ID":"4a2f4d8e-71bd-4abc-b0db-165cb43fea80","Type":"ContainerDied","Data":"df640fa6200e6d0969ca0eef9146dc5b41e707fdc0aa94c3f9bf5eb62cd156b0"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.606363 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df640fa6200e6d0969ca0eef9146dc5b41e707fdc0aa94c3f9bf5eb62cd156b0" Dec 03 06:51:27 crc 
kubenswrapper[4831]: I1203 06:51:27.606373 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-3a6d-account-create-update-gb5qw" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.610013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b468-account-create-update-h97z9" event={"ID":"b55e481e-04a7-4cc6-bddd-7f8a87ca6b77","Type":"ContainerDied","Data":"034629bb80b99cb32702afbd9273965123715babe7fea3f5ca70cc28540b1bb2"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.610060 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="034629bb80b99cb32702afbd9273965123715babe7fea3f5ca70cc28540b1bb2" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.610196 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b468-account-create-update-h97z9" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.613805 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-3e51-account-create-update-k4ppw" event={"ID":"b12b0f68-97d1-4253-b83b-692dbb30d970","Type":"ContainerDied","Data":"e54e1e62b1f27b22569ecf8055063901f50fdba152a7300f7ac1b8bfb15acc3f"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.613850 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e54e1e62b1f27b22569ecf8055063901f50fdba152a7300f7ac1b8bfb15acc3f" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.613946 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-3e51-account-create-update-k4ppw" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.617613 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-2ms27" event={"ID":"59ff8d77-29ce-4d59-919c-4280b2489608","Type":"ContainerDied","Data":"ddbe7e326110778436ebab8a3dcdfa70fb029b98f054c5100891754d91b1676c"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.617883 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddbe7e326110778436ebab8a3dcdfa70fb029b98f054c5100891754d91b1676c" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.618246 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-2ms27" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.630846 4831 generic.go:334] "Generic (PLEG): container finished" podID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" containerID="0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d" exitCode=0 Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.630889 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" event={"ID":"ebd0577c-85d9-4133-92a0-c9d6f78de9c3","Type":"ContainerDied","Data":"0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.630935 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.630969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-2qhvj" event={"ID":"ebd0577c-85d9-4133-92a0-c9d6f78de9c3","Type":"ContainerDied","Data":"e7019e118932256266272f3eece8363dc69c2973aac2312ca819d95a31a5ba78"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.630995 4831 scope.go:117] "RemoveContainer" containerID="0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.633943 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-h8zhk" event={"ID":"3614a116-7996-4106-a222-e2542d9dd89d","Type":"ContainerDied","Data":"445c23513b1a2fafbfacc5ddd89aa6339ff1d781828de5d1d04693201692e385"} Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.633982 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="445c23513b1a2fafbfacc5ddd89aa6339ff1d781828de5d1d04693201692e385" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.634042 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-h8zhk" Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.685548 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2qhvj"] Dec 03 06:51:27 crc kubenswrapper[4831]: I1203 06:51:27.693505 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-2qhvj"] Dec 03 06:51:29 crc kubenswrapper[4831]: I1203 06:51:29.022920 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" path="/var/lib/kubelet/pods/ebd0577c-85d9-4133-92a0-c9d6f78de9c3/volumes" Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.154803 4831 scope.go:117] "RemoveContainer" containerID="7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898" Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.344594 4831 scope.go:117] "RemoveContainer" containerID="0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d" Dec 03 06:51:30 crc kubenswrapper[4831]: E1203 06:51:30.345235 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d\": container with ID starting with 0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d not found: ID does not exist" containerID="0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d" Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.345294 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d"} err="failed to get container status \"0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d\": rpc error: code = NotFound desc = could not find container \"0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d\": container with ID starting with 
0bf7cb770f6eac71b38475ad3b0d7333cf291fb89f728f1a8e22c3f5e3044a9d not found: ID does not exist" Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.345350 4831 scope.go:117] "RemoveContainer" containerID="7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898" Dec 03 06:51:30 crc kubenswrapper[4831]: E1203 06:51:30.345727 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898\": container with ID starting with 7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898 not found: ID does not exist" containerID="7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898" Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.345755 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898"} err="failed to get container status \"7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898\": rpc error: code = NotFound desc = could not find container \"7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898\": container with ID starting with 7922af8f40fb25a7e49caff84628ea592a9c1b4338928ca8cb7af36d792af898 not found: ID does not exist" Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.667617 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" event={"ID":"9c3895c6-516b-4681-b3c3-0e9291dfe322","Type":"ContainerStarted","Data":"bf3305a025f14b656d2a04fe1dcbbd105079d7a12d26aa4a54d3b942e9fa3357"} Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.667803 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.670565 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vgj9j" 
event={"ID":"0310482f-ea00-44c5-8450-6b3b3ebe5e5f","Type":"ContainerStarted","Data":"d84b948796c9132bd3b586cb87ffe22a8a11b658e05ec8005cfa4a6023a15577"} Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.699215 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" podStartSLOduration=5.699191681 podStartE2EDuration="5.699191681s" podCreationTimestamp="2025-12-03 06:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:30.693539254 +0000 UTC m=+1228.037122772" watchObservedRunningTime="2025-12-03 06:51:30.699191681 +0000 UTC m=+1228.042775189" Dec 03 06:51:30 crc kubenswrapper[4831]: I1203 06:51:30.720952 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-vgj9j" podStartSLOduration=2.495841217 podStartE2EDuration="7.720933731s" podCreationTimestamp="2025-12-03 06:51:23 +0000 UTC" firstStartedPulling="2025-12-03 06:51:25.001534269 +0000 UTC m=+1222.345117777" lastFinishedPulling="2025-12-03 06:51:30.226626773 +0000 UTC m=+1227.570210291" observedRunningTime="2025-12-03 06:51:30.710205915 +0000 UTC m=+1228.053789453" watchObservedRunningTime="2025-12-03 06:51:30.720933731 +0000 UTC m=+1228.064517239" Dec 03 06:51:32 crc kubenswrapper[4831]: I1203 06:51:32.073659 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:51:33 crc kubenswrapper[4831]: I1203 06:51:33.717043 4831 generic.go:334] "Generic (PLEG): container finished" podID="0310482f-ea00-44c5-8450-6b3b3ebe5e5f" containerID="d84b948796c9132bd3b586cb87ffe22a8a11b658e05ec8005cfa4a6023a15577" exitCode=0 Dec 03 06:51:33 crc kubenswrapper[4831]: I1203 06:51:33.717299 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vgj9j" 
event={"ID":"0310482f-ea00-44c5-8450-6b3b3ebe5e5f","Type":"ContainerDied","Data":"d84b948796c9132bd3b586cb87ffe22a8a11b658e05ec8005cfa4a6023a15577"} Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.114677 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.222529 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tmhsz\" (UniqueName: \"kubernetes.io/projected/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-kube-api-access-tmhsz\") pod \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.222576 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-combined-ca-bundle\") pod \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.222654 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-config-data\") pod \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\" (UID: \"0310482f-ea00-44c5-8450-6b3b3ebe5e5f\") " Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.228197 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-kube-api-access-tmhsz" (OuterVolumeSpecName: "kube-api-access-tmhsz") pod "0310482f-ea00-44c5-8450-6b3b3ebe5e5f" (UID: "0310482f-ea00-44c5-8450-6b3b3ebe5e5f"). InnerVolumeSpecName "kube-api-access-tmhsz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.245744 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0310482f-ea00-44c5-8450-6b3b3ebe5e5f" (UID: "0310482f-ea00-44c5-8450-6b3b3ebe5e5f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.267621 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-config-data" (OuterVolumeSpecName: "config-data") pod "0310482f-ea00-44c5-8450-6b3b3ebe5e5f" (UID: "0310482f-ea00-44c5-8450-6b3b3ebe5e5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.325156 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.325198 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tmhsz\" (UniqueName: \"kubernetes.io/projected/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-kube-api-access-tmhsz\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.325221 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0310482f-ea00-44c5-8450-6b3b3ebe5e5f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.742437 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-vgj9j" 
event={"ID":"0310482f-ea00-44c5-8450-6b3b3ebe5e5f","Type":"ContainerDied","Data":"fcafac7a8e0c0d8211168ebc9b2f5364d8a3a17eb67571416b9df2f9c3c4edf5"} Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.742505 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fcafac7a8e0c0d8211168ebc9b2f5364d8a3a17eb67571416b9df2f9c3c4edf5" Dec 03 06:51:35 crc kubenswrapper[4831]: I1203 06:51:35.742509 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-vgj9j" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.062598 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-h8mjv"] Dec 03 06:51:36 crc kubenswrapper[4831]: E1203 06:51:36.063258 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b12b0f68-97d1-4253-b83b-692dbb30d970" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.063334 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12b0f68-97d1-4253-b83b-692dbb30d970" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: E1203 06:51:36.063396 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a2f4d8e-71bd-4abc-b0db-165cb43fea80" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.063442 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a2f4d8e-71bd-4abc-b0db-165cb43fea80" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: E1203 06:51:36.063496 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0310482f-ea00-44c5-8450-6b3b3ebe5e5f" containerName="keystone-db-sync" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.063541 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0310482f-ea00-44c5-8450-6b3b3ebe5e5f" containerName="keystone-db-sync" Dec 03 06:51:36 crc kubenswrapper[4831]: 
E1203 06:51:36.064921 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b55e481e-04a7-4cc6-bddd-7f8a87ca6b77" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.064977 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b55e481e-04a7-4cc6-bddd-7f8a87ca6b77" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: E1203 06:51:36.065024 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3614a116-7996-4106-a222-e2542d9dd89d" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065068 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3614a116-7996-4106-a222-e2542d9dd89d" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: E1203 06:51:36.065130 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59ff8d77-29ce-4d59-919c-4280b2489608" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065175 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="59ff8d77-29ce-4d59-919c-4280b2489608" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: E1203 06:51:36.065236 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="66411938-f6a0-4c6a-a478-2b9a451a2275" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065285 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="66411938-f6a0-4c6a-a478-2b9a451a2275" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: E1203 06:51:36.065358 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" containerName="init" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065405 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" containerName="init" 
Dec 03 06:51:36 crc kubenswrapper[4831]: E1203 06:51:36.065460 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" containerName="dnsmasq-dns" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065508 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" containerName="dnsmasq-dns" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065805 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b55e481e-04a7-4cc6-bddd-7f8a87ca6b77" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065860 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ebd0577c-85d9-4133-92a0-c9d6f78de9c3" containerName="dnsmasq-dns" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065915 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b12b0f68-97d1-4253-b83b-692dbb30d970" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.065984 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a2f4d8e-71bd-4abc-b0db-165cb43fea80" containerName="mariadb-account-create-update" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.066040 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="59ff8d77-29ce-4d59-919c-4280b2489608" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.066104 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="66411938-f6a0-4c6a-a478-2b9a451a2275" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.066163 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0310482f-ea00-44c5-8450-6b3b3ebe5e5f" containerName="keystone-db-sync" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.066217 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3614a116-7996-4106-a222-e2542d9dd89d" containerName="mariadb-database-create" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.066883 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.072229 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.072459 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.072612 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.072699 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.072842 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8w6b" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.079134 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h8mjv"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.085433 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwh2s"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.085656 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" podUID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerName="dnsmasq-dns" containerID="cri-o://bf3305a025f14b656d2a04fe1dcbbd105079d7a12d26aa4a54d3b942e9fa3357" gracePeriod=10 Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.087521 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 
06:51:36.208553 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-k2hwh"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.209955 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.250339 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-credential-keys\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.250389 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dw5r\" (UniqueName: \"kubernetes.io/projected/1331d712-2aa9-4aff-80dc-aed681290923-kube-api-access-8dw5r\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.250418 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-fernet-keys\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.250449 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-scripts\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.250466 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.250509 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-combined-ca-bundle\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.258372 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-k2hwh"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364261 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-config\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364348 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-credential-keys\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364405 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dw5r\" (UniqueName: \"kubernetes.io/projected/1331d712-2aa9-4aff-80dc-aed681290923-kube-api-access-8dw5r\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364421 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-fernet-keys\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364454 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-scripts\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364469 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364495 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc 
kubenswrapper[4831]: I1203 06:51:36.364523 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364539 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-combined-ca-bundle\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364571 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x99n5\" (UniqueName: \"kubernetes.io/projected/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-kube-api-access-x99n5\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.364594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.381217 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-fernet-keys\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 
06:51:36.386006 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.391133 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-combined-ca-bundle\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.391362 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-credential-keys\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.405569 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-scripts\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.454926 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dw5r\" (UniqueName: \"kubernetes.io/projected/1331d712-2aa9-4aff-80dc-aed681290923-kube-api-access-8dw5r\") pod \"keystone-bootstrap-h8mjv\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.465623 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.465875 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.465924 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.465972 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x99n5\" (UniqueName: \"kubernetes.io/projected/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-kube-api-access-x99n5\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.465999 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.466063 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-config\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.467170 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-config\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.467881 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.469695 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.470358 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.482898 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: 
\"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.491951 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-dkxbk"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.493244 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.494184 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" podUID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.136:5353: connect: connection refused" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.498025 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.498710 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.498868 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qh756" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.500072 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x99n5\" (UniqueName: \"kubernetes.io/projected/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-kube-api-access-x99n5\") pod \"dnsmasq-dns-bbf5cc879-k2hwh\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.535627 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-68mzw"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.540702 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.542857 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.552734 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-r8xlx" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.555746 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.556401 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.576834 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-68mzw"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.586639 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dkxbk"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.627373 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.629577 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.642835 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.643124 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.665171 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674262 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-config\") pod \"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674301 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66ksk\" (UniqueName: \"kubernetes.io/projected/6e478744-468b-40e8-b4a3-236bdd2bd5ca-kube-api-access-66ksk\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674365 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-db-sync-config-data\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674402 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-combined-ca-bundle\") pod 
\"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674429 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-combined-ca-bundle\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674450 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-scripts\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674469 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zgwvx\" (UniqueName: \"kubernetes.io/projected/b6e74a76-928d-4a03-ae60-6749fafef9ae-kube-api-access-zgwvx\") pod \"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674530 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-config-data\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.674571 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e478744-468b-40e8-b4a3-236bdd2bd5ca-etc-machine-id\") pod \"cinder-db-sync-dkxbk\" (UID: 
\"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.693818 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.704836 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-hh6jm"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.709767 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.730970 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.731212 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9lcvr" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.731349 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.748161 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-t4m8r"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.751911 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.755841 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n9qq9" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.758627 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-config-data\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e478744-468b-40e8-b4a3-236bdd2bd5ca-etc-machine-id\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776372 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-config\") pod \"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776392 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66ksk\" (UniqueName: \"kubernetes.io/projected/6e478744-468b-40e8-b4a3-236bdd2bd5ca-kube-api-access-66ksk\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776409 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-db-sync-config-data\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776500 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-combined-ca-bundle\") pod \"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776522 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-combined-ca-bundle\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776568 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-scripts\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776584 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-log-httpd\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-scripts\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776647 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zgwvx\" (UniqueName: \"kubernetes.io/projected/b6e74a76-928d-4a03-ae60-6749fafef9ae-kube-api-access-zgwvx\") pod \"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776711 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t685g\" (UniqueName: \"kubernetes.io/projected/a7a6ac9f-6470-4d84-a939-831f270eae54-kube-api-access-t685g\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776747 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776766 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776803 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-config-data\") pod 
\"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776822 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-run-httpd\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.776926 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e478744-468b-40e8-b4a3-236bdd2bd5ca-etc-machine-id\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.783327 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-combined-ca-bundle\") pod \"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.784529 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-db-sync-config-data\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.788024 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-combined-ca-bundle\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: 
I1203 06:51:36.789817 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-config\") pod \"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.790449 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-scripts\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.793656 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-config-data\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.798341 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zgwvx\" (UniqueName: \"kubernetes.io/projected/b6e74a76-928d-4a03-ae60-6749fafef9ae-kube-api-access-zgwvx\") pod \"neutron-db-sync-68mzw\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.804348 4831 generic.go:334] "Generic (PLEG): container finished" podID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerID="bf3305a025f14b656d2a04fe1dcbbd105079d7a12d26aa4a54d3b942e9fa3357" exitCode=0 Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.804583 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66ksk\" (UniqueName: \"kubernetes.io/projected/6e478744-468b-40e8-b4a3-236bdd2bd5ca-kube-api-access-66ksk\") pod \"cinder-db-sync-dkxbk\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " 
pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.770942 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t4m8r"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.808346 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" event={"ID":"9c3895c6-516b-4681-b3c3-0e9291dfe322","Type":"ContainerDied","Data":"bf3305a025f14b656d2a04fe1dcbbd105079d7a12d26aa4a54d3b942e9fa3357"} Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.808378 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hh6jm"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.808390 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-k2hwh"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.819978 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xv9jp"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.821289 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.838915 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xv9jp"] Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.878815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-scripts\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.878883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58a3c00-1c5b-4c17-88d7-459798e81d76-logs\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.878942 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-scripts\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.878964 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-log-httpd\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879033 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-db-sync-config-data\") pod \"barbican-db-sync-t4m8r\" (UID: 
\"9f2a305c-36bf-477e-8468-919407ea5d90\") " pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879070 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t685g\" (UniqueName: \"kubernetes.io/projected/a7a6ac9f-6470-4d84-a939-831f270eae54-kube-api-access-t685g\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879094 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-config-data\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879125 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9tmm\" (UniqueName: \"kubernetes.io/projected/c58a3c00-1c5b-4c17-88d7-459798e81d76-kube-api-access-g9tmm\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlg7g\" (UniqueName: \"kubernetes.io/projected/9f2a305c-36bf-477e-8468-919407ea5d90-kube-api-access-nlg7g\") pod \"barbican-db-sync-t4m8r\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879164 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-combined-ca-bundle\") pod \"placement-db-sync-hh6jm\" (UID: 
\"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879217 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-combined-ca-bundle\") pod \"barbican-db-sync-t4m8r\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879265 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-run-httpd\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879294 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-config-data\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.879940 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-log-httpd\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.881663 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.882704 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-run-httpd\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.887107 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.887188 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-scripts\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.887236 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-config-data\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.889799 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.897340 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t685g\" (UniqueName: \"kubernetes.io/projected/a7a6ac9f-6470-4d84-a939-831f270eae54-kube-api-access-t685g\") pod \"ceilometer-0\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " pod="openstack/ceilometer-0" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.899289 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68mzw" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988295 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9tmm\" (UniqueName: \"kubernetes.io/projected/c58a3c00-1c5b-4c17-88d7-459798e81d76-kube-api-access-g9tmm\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988409 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-combined-ca-bundle\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988426 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nlg7g\" (UniqueName: \"kubernetes.io/projected/9f2a305c-36bf-477e-8468-919407ea5d90-kube-api-access-nlg7g\") pod \"barbican-db-sync-t4m8r\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988452 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-combined-ca-bundle\") pod \"barbican-db-sync-t4m8r\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988497 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988520 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-scripts\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988555 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58a3c00-1c5b-4c17-88d7-459798e81d76-logs\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988572 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-config\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988631 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqdwk\" (UniqueName: \"kubernetes.io/projected/6fe82dbc-16d3-4d94-b356-d0b862ba2019-kube-api-access-zqdwk\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988663 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-db-sync-config-data\") pod \"barbican-db-sync-t4m8r\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988691 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988713 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.988734 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-config-data\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.989539 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58a3c00-1c5b-4c17-88d7-459798e81d76-logs\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.993430 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-combined-ca-bundle\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.995803 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-scripts\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.996269 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-db-sync-config-data\") pod \"barbican-db-sync-t4m8r\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:36 crc kubenswrapper[4831]: I1203 06:51:36.996765 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-combined-ca-bundle\") pod \"barbican-db-sync-t4m8r\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") 
" pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.003019 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-config-data\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.004790 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlg7g\" (UniqueName: \"kubernetes.io/projected/9f2a305c-36bf-477e-8468-919407ea5d90-kube-api-access-nlg7g\") pod \"barbican-db-sync-t4m8r\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.013912 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9tmm\" (UniqueName: \"kubernetes.io/projected/c58a3c00-1c5b-4c17-88d7-459798e81d76-kube-api-access-g9tmm\") pod \"placement-db-sync-hh6jm\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.057575 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.086871 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-hh6jm" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.090492 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.090530 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.090582 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.090618 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.090638 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-config\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 
06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.090674 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqdwk\" (UniqueName: \"kubernetes.io/projected/6fe82dbc-16d3-4d94-b356-d0b862ba2019-kube-api-access-zqdwk\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.092925 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.093334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.093546 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.095551 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-config\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.096923 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.098886 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.123095 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqdwk\" (UniqueName: \"kubernetes.io/projected/6fe82dbc-16d3-4d94-b356-d0b862ba2019-kube-api-access-zqdwk\") pod \"dnsmasq-dns-56df8fb6b7-xv9jp\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.139860 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.198172 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-k2hwh"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.232760 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.234377 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.237453 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.240461 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.240709 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.240941 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b8246" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.253442 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.369824 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-h8mjv"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.380957 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.383018 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.384444 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.386101 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.390295 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.410371 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5g68\" (UniqueName: \"kubernetes.io/projected/80da9f17-8958-4454-a183-2c4761b6847d-kube-api-access-n5g68\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.410430 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.410457 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-logs\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.410486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.410523 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-scripts\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.410559 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.410642 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-config-data\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.410680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.507688 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-dkxbk"] Dec 03 
06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513482 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-logs\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513549 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5g68\" (UniqueName: \"kubernetes.io/projected/80da9f17-8958-4454-a183-2c4761b6847d-kube-api-access-n5g68\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513586 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513605 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513624 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-logs\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc 
kubenswrapper[4831]: I1203 06:51:37.513642 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513666 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-scripts\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513692 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513717 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513757 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513773 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcnxm\" (UniqueName: \"kubernetes.io/projected/134c032a-7dc1-45df-9512-849be8d385bf-kube-api-access-bcnxm\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513792 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513810 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513827 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-config-data\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513852 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.513876 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.514896 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-logs\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.515346 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.517803 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.521331 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: W1203 06:51:37.525566 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e478744_468b_40e8_b4a3_236bdd2bd5ca.slice/crio-e9c6795a9a67c4e369d9d4937a274d77480327f5f5809ee87d1cd6e489970565 WatchSource:0}: Error finding container e9c6795a9a67c4e369d9d4937a274d77480327f5f5809ee87d1cd6e489970565: Status 404 returned error can't find the container with id e9c6795a9a67c4e369d9d4937a274d77480327f5f5809ee87d1cd6e489970565 Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.525818 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-config-data\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.528361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.531407 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-scripts\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.533791 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5g68\" (UniqueName: \"kubernetes.io/projected/80da9f17-8958-4454-a183-2c4761b6847d-kube-api-access-n5g68\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 
06:51:37.585388 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615396 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615488 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcnxm\" (UniqueName: \"kubernetes.io/projected/134c032a-7dc1-45df-9512-849be8d385bf-kube-api-access-bcnxm\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615509 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615536 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615608 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-logs\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615650 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.615960 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.617068 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.622573 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-config-data\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.622709 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-scripts\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.623173 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-logs\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.624426 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.626911 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: 
\"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.639103 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bcnxm\" (UniqueName: \"kubernetes.io/projected/134c032a-7dc1-45df-9512-849be8d385bf-kube-api-access-bcnxm\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.650363 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-68mzw"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.654943 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.827786 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" event={"ID":"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5","Type":"ContainerStarted","Data":"bdd0cc3e22b8006e797ab9c429ac9547fb31dee5a2287d676e3f78df5fdbbf1e"} Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.829288 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68mzw" event={"ID":"b6e74a76-928d-4a03-ae60-6749fafef9ae","Type":"ContainerStarted","Data":"c80b230cea3f188974fefc687d1bc1955c19c712068cda5da38d9b1876d5caaf"} Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.831707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dkxbk" event={"ID":"6e478744-468b-40e8-b4a3-236bdd2bd5ca","Type":"ContainerStarted","Data":"e9c6795a9a67c4e369d9d4937a274d77480327f5f5809ee87d1cd6e489970565"} Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.832612 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h8mjv" event={"ID":"1331d712-2aa9-4aff-80dc-aed681290923","Type":"ContainerStarted","Data":"884a095a7cb8f0f9c2a977af65755d61387af687a1b25fbd9d0104baebfb209e"} Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.837694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" event={"ID":"9c3895c6-516b-4681-b3c3-0e9291dfe322","Type":"ContainerDied","Data":"f8a9f1707ad07bdeff1e1ee42e52a17df79e02faa58bcf8cf43507966af343c5"} Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.837752 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8a9f1707ad07bdeff1e1ee42e52a17df79e02faa58bcf8cf43507966af343c5" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.850268 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-hh6jm"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.871275 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.876604 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-t4m8r"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.898986 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xv9jp"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.911553 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.916628 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.938901 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.997459 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-svc\") pod \"9c3895c6-516b-4681-b3c3-0e9291dfe322\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.997898 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-nb\") pod \"9c3895c6-516b-4681-b3c3-0e9291dfe322\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.997955 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-swift-storage-0\") pod \"9c3895c6-516b-4681-b3c3-0e9291dfe322\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.997982 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-sb\") pod \"9c3895c6-516b-4681-b3c3-0e9291dfe322\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.998013 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h44j\" (UniqueName: \"kubernetes.io/projected/9c3895c6-516b-4681-b3c3-0e9291dfe322-kube-api-access-2h44j\") pod \"9c3895c6-516b-4681-b3c3-0e9291dfe322\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " Dec 03 06:51:37 crc kubenswrapper[4831]: I1203 06:51:37.998065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-config\") pod \"9c3895c6-516b-4681-b3c3-0e9291dfe322\" (UID: \"9c3895c6-516b-4681-b3c3-0e9291dfe322\") " Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.001912 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c3895c6-516b-4681-b3c3-0e9291dfe322-kube-api-access-2h44j" (OuterVolumeSpecName: "kube-api-access-2h44j") pod "9c3895c6-516b-4681-b3c3-0e9291dfe322" (UID: "9c3895c6-516b-4681-b3c3-0e9291dfe322"). InnerVolumeSpecName "kube-api-access-2h44j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.066797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "9c3895c6-516b-4681-b3c3-0e9291dfe322" (UID: "9c3895c6-516b-4681-b3c3-0e9291dfe322"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.103203 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-config" (OuterVolumeSpecName: "config") pod "9c3895c6-516b-4681-b3c3-0e9291dfe322" (UID: "9c3895c6-516b-4681-b3c3-0e9291dfe322"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.104222 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.104238 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h44j\" (UniqueName: \"kubernetes.io/projected/9c3895c6-516b-4681-b3c3-0e9291dfe322-kube-api-access-2h44j\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.104248 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.105023 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9c3895c6-516b-4681-b3c3-0e9291dfe322" (UID: "9c3895c6-516b-4681-b3c3-0e9291dfe322"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.106601 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "9c3895c6-516b-4681-b3c3-0e9291dfe322" (UID: "9c3895c6-516b-4681-b3c3-0e9291dfe322"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.111127 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "9c3895c6-516b-4681-b3c3-0e9291dfe322" (UID: "9c3895c6-516b-4681-b3c3-0e9291dfe322"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.205302 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.205345 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.205355 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/9c3895c6-516b-4681-b3c3-0e9291dfe322-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.668459 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.737969 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.756133 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:38 crc kubenswrapper[4831]: W1203 06:51:38.786434 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod134c032a_7dc1_45df_9512_849be8d385bf.slice/crio-7ce65f8f315add44511954df2cc85c4d4d119bd2bf255ed524d84ef5986baf05 WatchSource:0}: Error finding container 7ce65f8f315add44511954df2cc85c4d4d119bd2bf255ed524d84ef5986baf05: Status 404 returned error can't find the container with id 7ce65f8f315add44511954df2cc85c4d4d119bd2bf255ed524d84ef5986baf05 Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.800729 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.829998 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.882447 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hh6jm" event={"ID":"c58a3c00-1c5b-4c17-88d7-459798e81d76","Type":"ContainerStarted","Data":"666a64d7877ba5d68da8014e2e260831d511ef7e28dca7a542dae391651d4062"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.892587 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68mzw" event={"ID":"b6e74a76-928d-4a03-ae60-6749fafef9ae","Type":"ContainerStarted","Data":"4eef0053326ed74823bd51d439a4fac226fd4c4da5f8fb1077ece054b1be9bf8"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.894562 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134c032a-7dc1-45df-9512-849be8d385bf","Type":"ContainerStarted","Data":"7ce65f8f315add44511954df2cc85c4d4d119bd2bf255ed524d84ef5986baf05"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.910143 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-68mzw" podStartSLOduration=2.910099079 podStartE2EDuration="2.910099079s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:38.904604597 +0000 UTC m=+1236.248188105" watchObservedRunningTime="2025-12-03 06:51:38.910099079 +0000 UTC m=+1236.253682587" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.911671 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t4m8r" event={"ID":"9f2a305c-36bf-477e-8468-919407ea5d90","Type":"ContainerStarted","Data":"9f9b77991af6a9839b07dc279437ba6060076b65950d70e99b6675607d1bbed8"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.936112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80da9f17-8958-4454-a183-2c4761b6847d","Type":"ContainerStarted","Data":"a42e73f3069f4f8124eedd2a9e090ca6d3741d3f190350ede5dac68980119ddd"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.937996 4831 generic.go:334] "Generic (PLEG): container finished" podID="e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" containerID="0db1c249643f7a671087662bf58b0ff63f74038b7bb937213bc5c496743487ad" exitCode=0 Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.938369 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" event={"ID":"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5","Type":"ContainerDied","Data":"0db1c249643f7a671087662bf58b0ff63f74038b7bb937213bc5c496743487ad"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.953051 4831 generic.go:334] "Generic (PLEG): container finished" podID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" containerID="193af4264efc564c1aa18e6af7c841667713b4ea0f34fad4c1c23faffa7af9d9" exitCode=0 Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.953121 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" event={"ID":"6fe82dbc-16d3-4d94-b356-d0b862ba2019","Type":"ContainerDied","Data":"193af4264efc564c1aa18e6af7c841667713b4ea0f34fad4c1c23faffa7af9d9"} Dec 03 
06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.953147 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" event={"ID":"6fe82dbc-16d3-4d94-b356-d0b862ba2019","Type":"ContainerStarted","Data":"82203b0cf936c248d348a35b548d187a38fc3503a2d3e3adc9e2a3f36770a9b1"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.956366 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h8mjv" event={"ID":"1331d712-2aa9-4aff-80dc-aed681290923","Type":"ContainerStarted","Data":"70797ca9857cf493d328084e6e35279fdc2c8eaa7d4a14ba2b9c4f7d662a349c"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.967285 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-pwh2s" Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.968857 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerStarted","Data":"ad3bbf7df0f8212abe98d4c2e2ee78158e622e9d1d944b9218e52b270c9b9b31"} Dec 03 06:51:38 crc kubenswrapper[4831]: I1203 06:51:38.986852 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-h8mjv" podStartSLOduration=2.986830904 podStartE2EDuration="2.986830904s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:38.986749302 +0000 UTC m=+1236.330332820" watchObservedRunningTime="2025-12-03 06:51:38.986830904 +0000 UTC m=+1236.330414412" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.083591 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-pwh2s"] Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.083621 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5f59b8f679-pwh2s"] Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.336235 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.447053 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-sb\") pod \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.447169 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-swift-storage-0\") pod \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.447205 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-svc\") pod \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.447291 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x99n5\" (UniqueName: \"kubernetes.io/projected/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-kube-api-access-x99n5\") pod \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.447331 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-nb\") pod \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\" (UID: 
\"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.447416 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-config\") pod \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\" (UID: \"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5\") " Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.464911 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-kube-api-access-x99n5" (OuterVolumeSpecName: "kube-api-access-x99n5") pod "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" (UID: "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5"). InnerVolumeSpecName "kube-api-access-x99n5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.476978 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" (UID: "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.477739 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" (UID: "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.490767 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" (UID: "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.492850 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-config" (OuterVolumeSpecName: "config") pod "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" (UID: "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.506124 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" (UID: "e1fdcf55-846d-4f97-b3e9-fea6afd59eb5"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.549960 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.549994 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.550007 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.550039 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.550065 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.550076 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x99n5\" (UniqueName: \"kubernetes.io/projected/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5-kube-api-access-x99n5\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.994875 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" event={"ID":"6fe82dbc-16d3-4d94-b356-d0b862ba2019","Type":"ContainerStarted","Data":"d4b25be6a2b9d2dd9447b14b20d352089f64160bad21d040c013974a1a7d4440"} Dec 03 06:51:39 crc 
kubenswrapper[4831]: I1203 06:51:39.996131 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.999436 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" event={"ID":"e1fdcf55-846d-4f97-b3e9-fea6afd59eb5","Type":"ContainerDied","Data":"bdd0cc3e22b8006e797ab9c429ac9547fb31dee5a2287d676e3f78df5fdbbf1e"} Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.999498 4831 scope.go:117] "RemoveContainer" containerID="0db1c249643f7a671087662bf58b0ff63f74038b7bb937213bc5c496743487ad" Dec 03 06:51:39 crc kubenswrapper[4831]: I1203 06:51:39.999687 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-k2hwh" Dec 03 06:51:40 crc kubenswrapper[4831]: I1203 06:51:40.052264 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" podStartSLOduration=4.052247639 podStartE2EDuration="4.052247639s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:40.011876105 +0000 UTC m=+1237.355459613" watchObservedRunningTime="2025-12-03 06:51:40.052247639 +0000 UTC m=+1237.395831147" Dec 03 06:51:40 crc kubenswrapper[4831]: I1203 06:51:40.174027 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-k2hwh"] Dec 03 06:51:40 crc kubenswrapper[4831]: I1203 06:51:40.182465 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-k2hwh"] Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.012514 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="134c032a-7dc1-45df-9512-849be8d385bf" containerName="glance-log" 
containerID="cri-o://38e992e90f8cdd23f97b52c198fc2885241df14068b5bffcfb667f1dbd9cff24" gracePeriod=30 Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.012579 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="134c032a-7dc1-45df-9512-849be8d385bf" containerName="glance-httpd" containerID="cri-o://7eb566f9dd4995b0244674294fa6dea622a920acb64a0f2f460995ca44f73948" gracePeriod=30 Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.016043 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="80da9f17-8958-4454-a183-2c4761b6847d" containerName="glance-log" containerID="cri-o://d2479529cd432da141540ab78e74d5cc78398e8e90645c6aca60dbbff68c1718" gracePeriod=30 Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.016269 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="80da9f17-8958-4454-a183-2c4761b6847d" containerName="glance-httpd" containerID="cri-o://9269f2010b2163ea7eb048fe35836cfcf7e94e1ae198e8400186fcf8799a0ddd" gracePeriod=30 Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.033053 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c3895c6-516b-4681-b3c3-0e9291dfe322" path="/var/lib/kubelet/pods/9c3895c6-516b-4681-b3c3-0e9291dfe322/volumes" Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.033940 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" path="/var/lib/kubelet/pods/e1fdcf55-846d-4f97-b3e9-fea6afd59eb5/volumes" Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.041829 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134c032a-7dc1-45df-9512-849be8d385bf","Type":"ContainerStarted","Data":"7eb566f9dd4995b0244674294fa6dea622a920acb64a0f2f460995ca44f73948"} 
Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.042340 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134c032a-7dc1-45df-9512-849be8d385bf","Type":"ContainerStarted","Data":"38e992e90f8cdd23f97b52c198fc2885241df14068b5bffcfb667f1dbd9cff24"} Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.042353 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80da9f17-8958-4454-a183-2c4761b6847d","Type":"ContainerStarted","Data":"9269f2010b2163ea7eb048fe35836cfcf7e94e1ae198e8400186fcf8799a0ddd"} Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.042365 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80da9f17-8958-4454-a183-2c4761b6847d","Type":"ContainerStarted","Data":"d2479529cd432da141540ab78e74d5cc78398e8e90645c6aca60dbbff68c1718"} Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.050554 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.050533512 podStartE2EDuration="5.050533512s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:41.035393717 +0000 UTC m=+1238.378977235" watchObservedRunningTime="2025-12-03 06:51:41.050533512 +0000 UTC m=+1238.394117020" Dec 03 06:51:41 crc kubenswrapper[4831]: I1203 06:51:41.073002 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.072979216 podStartE2EDuration="5.072979216s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:41.056937992 +0000 UTC 
m=+1238.400521510" watchObservedRunningTime="2025-12-03 06:51:41.072979216 +0000 UTC m=+1238.416562724" Dec 03 06:51:42 crc kubenswrapper[4831]: I1203 06:51:42.043537 4831 generic.go:334] "Generic (PLEG): container finished" podID="134c032a-7dc1-45df-9512-849be8d385bf" containerID="7eb566f9dd4995b0244674294fa6dea622a920acb64a0f2f460995ca44f73948" exitCode=143 Dec 03 06:51:42 crc kubenswrapper[4831]: I1203 06:51:42.043573 4831 generic.go:334] "Generic (PLEG): container finished" podID="134c032a-7dc1-45df-9512-849be8d385bf" containerID="38e992e90f8cdd23f97b52c198fc2885241df14068b5bffcfb667f1dbd9cff24" exitCode=143 Dec 03 06:51:42 crc kubenswrapper[4831]: I1203 06:51:42.043632 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134c032a-7dc1-45df-9512-849be8d385bf","Type":"ContainerDied","Data":"7eb566f9dd4995b0244674294fa6dea622a920acb64a0f2f460995ca44f73948"} Dec 03 06:51:42 crc kubenswrapper[4831]: I1203 06:51:42.043699 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134c032a-7dc1-45df-9512-849be8d385bf","Type":"ContainerDied","Data":"38e992e90f8cdd23f97b52c198fc2885241df14068b5bffcfb667f1dbd9cff24"} Dec 03 06:51:42 crc kubenswrapper[4831]: I1203 06:51:42.053639 4831 generic.go:334] "Generic (PLEG): container finished" podID="80da9f17-8958-4454-a183-2c4761b6847d" containerID="9269f2010b2163ea7eb048fe35836cfcf7e94e1ae198e8400186fcf8799a0ddd" exitCode=143 Dec 03 06:51:42 crc kubenswrapper[4831]: I1203 06:51:42.053669 4831 generic.go:334] "Generic (PLEG): container finished" podID="80da9f17-8958-4454-a183-2c4761b6847d" containerID="d2479529cd432da141540ab78e74d5cc78398e8e90645c6aca60dbbff68c1718" exitCode=143 Dec 03 06:51:42 crc kubenswrapper[4831]: I1203 06:51:42.054422 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"80da9f17-8958-4454-a183-2c4761b6847d","Type":"ContainerDied","Data":"9269f2010b2163ea7eb048fe35836cfcf7e94e1ae198e8400186fcf8799a0ddd"} Dec 03 06:51:42 crc kubenswrapper[4831]: I1203 06:51:42.054447 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80da9f17-8958-4454-a183-2c4761b6847d","Type":"ContainerDied","Data":"d2479529cd432da141540ab78e74d5cc78398e8e90645c6aca60dbbff68c1718"} Dec 03 06:51:43 crc kubenswrapper[4831]: I1203 06:51:43.077676 4831 generic.go:334] "Generic (PLEG): container finished" podID="1331d712-2aa9-4aff-80dc-aed681290923" containerID="70797ca9857cf493d328084e6e35279fdc2c8eaa7d4a14ba2b9c4f7d662a349c" exitCode=0 Dec 03 06:51:43 crc kubenswrapper[4831]: I1203 06:51:43.077734 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h8mjv" event={"ID":"1331d712-2aa9-4aff-80dc-aed681290923","Type":"ContainerDied","Data":"70797ca9857cf493d328084e6e35279fdc2c8eaa7d4a14ba2b9c4f7d662a349c"} Dec 03 06:51:47 crc kubenswrapper[4831]: I1203 06:51:47.143047 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:51:47 crc kubenswrapper[4831]: I1203 06:51:47.200978 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bkmqn"] Dec 03 06:51:47 crc kubenswrapper[4831]: I1203 06:51:47.201285 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" containerName="dnsmasq-dns" containerID="cri-o://003eea5872bfb8451fffe36f9af7621579806c93e61afbb27894cb3cfeb28df9" gracePeriod=10 Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.068473 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" containerName="dnsmasq-dns" probeResult="failure" 
output="dial tcp 10.217.0.122:5353: connect: connection refused" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.103042 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.132197 4831 generic.go:334] "Generic (PLEG): container finished" podID="ff831122-53fa-4905-a6c7-71f216c98da5" containerID="003eea5872bfb8451fffe36f9af7621579806c93e61afbb27894cb3cfeb28df9" exitCode=0 Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.132271 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" event={"ID":"ff831122-53fa-4905-a6c7-71f216c98da5","Type":"ContainerDied","Data":"003eea5872bfb8451fffe36f9af7621579806c93e61afbb27894cb3cfeb28df9"} Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.135239 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-h8mjv" event={"ID":"1331d712-2aa9-4aff-80dc-aed681290923","Type":"ContainerDied","Data":"884a095a7cb8f0f9c2a977af65755d61387af687a1b25fbd9d0104baebfb209e"} Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.135289 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="884a095a7cb8f0f9c2a977af65755d61387af687a1b25fbd9d0104baebfb209e" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.135370 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-h8mjv" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.243962 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-scripts\") pod \"1331d712-2aa9-4aff-80dc-aed681290923\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.244035 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-combined-ca-bundle\") pod \"1331d712-2aa9-4aff-80dc-aed681290923\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.244149 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-credential-keys\") pod \"1331d712-2aa9-4aff-80dc-aed681290923\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.244232 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dw5r\" (UniqueName: \"kubernetes.io/projected/1331d712-2aa9-4aff-80dc-aed681290923-kube-api-access-8dw5r\") pod \"1331d712-2aa9-4aff-80dc-aed681290923\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.244259 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data\") pod \"1331d712-2aa9-4aff-80dc-aed681290923\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.244285 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-fernet-keys\") pod \"1331d712-2aa9-4aff-80dc-aed681290923\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.249844 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-scripts" (OuterVolumeSpecName: "scripts") pod "1331d712-2aa9-4aff-80dc-aed681290923" (UID: "1331d712-2aa9-4aff-80dc-aed681290923"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.249986 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1331d712-2aa9-4aff-80dc-aed681290923-kube-api-access-8dw5r" (OuterVolumeSpecName: "kube-api-access-8dw5r") pod "1331d712-2aa9-4aff-80dc-aed681290923" (UID: "1331d712-2aa9-4aff-80dc-aed681290923"). InnerVolumeSpecName "kube-api-access-8dw5r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.250555 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "1331d712-2aa9-4aff-80dc-aed681290923" (UID: "1331d712-2aa9-4aff-80dc-aed681290923"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.251559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "1331d712-2aa9-4aff-80dc-aed681290923" (UID: "1331d712-2aa9-4aff-80dc-aed681290923"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: E1203 06:51:48.274878 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data podName:1331d712-2aa9-4aff-80dc-aed681290923 nodeName:}" failed. No retries permitted until 2025-12-03 06:51:48.774847352 +0000 UTC m=+1246.118430860 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data") pod "1331d712-2aa9-4aff-80dc-aed681290923" (UID: "1331d712-2aa9-4aff-80dc-aed681290923") : error deleting /var/lib/kubelet/pods/1331d712-2aa9-4aff-80dc-aed681290923/volume-subpaths: remove /var/lib/kubelet/pods/1331d712-2aa9-4aff-80dc-aed681290923/volume-subpaths: no such file or directory Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.291563 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1331d712-2aa9-4aff-80dc-aed681290923" (UID: "1331d712-2aa9-4aff-80dc-aed681290923"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.347479 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dw5r\" (UniqueName: \"kubernetes.io/projected/1331d712-2aa9-4aff-80dc-aed681290923-kube-api-access-8dw5r\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.348305 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.348349 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.348360 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.348369 4831 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.673750 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-scripts\") pod \"80da9f17-8958-4454-a183-2c4761b6847d\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857160 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5g68\" (UniqueName: \"kubernetes.io/projected/80da9f17-8958-4454-a183-2c4761b6847d-kube-api-access-n5g68\") pod \"80da9f17-8958-4454-a183-2c4761b6847d\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857223 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-httpd-run\") pod \"80da9f17-8958-4454-a183-2c4761b6847d\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857247 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-config-data\") pod \"80da9f17-8958-4454-a183-2c4761b6847d\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857372 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data\") pod \"1331d712-2aa9-4aff-80dc-aed681290923\" (UID: \"1331d712-2aa9-4aff-80dc-aed681290923\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857408 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"80da9f17-8958-4454-a183-2c4761b6847d\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857458 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-combined-ca-bundle\") pod \"80da9f17-8958-4454-a183-2c4761b6847d\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857480 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-logs\") pod \"80da9f17-8958-4454-a183-2c4761b6847d\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.857511 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-public-tls-certs\") pod \"80da9f17-8958-4454-a183-2c4761b6847d\" (UID: \"80da9f17-8958-4454-a183-2c4761b6847d\") " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.858833 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "80da9f17-8958-4454-a183-2c4761b6847d" (UID: "80da9f17-8958-4454-a183-2c4761b6847d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.859573 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-logs" (OuterVolumeSpecName: "logs") pod "80da9f17-8958-4454-a183-2c4761b6847d" (UID: "80da9f17-8958-4454-a183-2c4761b6847d"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.860766 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-scripts" (OuterVolumeSpecName: "scripts") pod "80da9f17-8958-4454-a183-2c4761b6847d" (UID: "80da9f17-8958-4454-a183-2c4761b6847d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.861603 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80da9f17-8958-4454-a183-2c4761b6847d-kube-api-access-n5g68" (OuterVolumeSpecName: "kube-api-access-n5g68") pod "80da9f17-8958-4454-a183-2c4761b6847d" (UID: "80da9f17-8958-4454-a183-2c4761b6847d"). InnerVolumeSpecName "kube-api-access-n5g68". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.861980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data" (OuterVolumeSpecName: "config-data") pod "1331d712-2aa9-4aff-80dc-aed681290923" (UID: "1331d712-2aa9-4aff-80dc-aed681290923"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.862854 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "80da9f17-8958-4454-a183-2c4761b6847d" (UID: "80da9f17-8958-4454-a183-2c4761b6847d"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.882186 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "80da9f17-8958-4454-a183-2c4761b6847d" (UID: "80da9f17-8958-4454-a183-2c4761b6847d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.903862 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "80da9f17-8958-4454-a183-2c4761b6847d" (UID: "80da9f17-8958-4454-a183-2c4761b6847d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.906528 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-config-data" (OuterVolumeSpecName: "config-data") pod "80da9f17-8958-4454-a183-2c4761b6847d" (UID: "80da9f17-8958-4454-a183-2c4761b6847d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960054 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960102 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960122 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1331d712-2aa9-4aff-80dc-aed681290923-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960171 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960191 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960213 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80da9f17-8958-4454-a183-2c4761b6847d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960229 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960282 4831 reconciler_common.go:293] "Volume detached for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/80da9f17-8958-4454-a183-2c4761b6847d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:48 crc kubenswrapper[4831]: I1203 06:51:48.960299 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5g68\" (UniqueName: \"kubernetes.io/projected/80da9f17-8958-4454-a183-2c4761b6847d-kube-api-access-n5g68\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.003008 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.062752 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.156856 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"80da9f17-8958-4454-a183-2c4761b6847d","Type":"ContainerDied","Data":"a42e73f3069f4f8124eedd2a9e090ca6d3741d3f190350ede5dac68980119ddd"} Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.156909 4831 scope.go:117] "RemoveContainer" containerID="9269f2010b2163ea7eb048fe35836cfcf7e94e1ae198e8400186fcf8799a0ddd" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.156945 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.189249 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.197100 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.213083 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-h8mjv"] Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.224773 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-h8mjv"] Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.231600 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:49 crc kubenswrapper[4831]: E1203 06:51:49.231996 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" containerName="init" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232009 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" containerName="init" Dec 03 06:51:49 crc kubenswrapper[4831]: E1203 06:51:49.232015 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80da9f17-8958-4454-a183-2c4761b6847d" containerName="glance-log" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232022 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80da9f17-8958-4454-a183-2c4761b6847d" containerName="glance-log" Dec 03 06:51:49 crc kubenswrapper[4831]: E1203 06:51:49.232043 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerName="dnsmasq-dns" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232049 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerName="dnsmasq-dns" Dec 03 06:51:49 crc kubenswrapper[4831]: E1203 06:51:49.232059 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1331d712-2aa9-4aff-80dc-aed681290923" containerName="keystone-bootstrap" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232065 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1331d712-2aa9-4aff-80dc-aed681290923" containerName="keystone-bootstrap" Dec 03 06:51:49 crc kubenswrapper[4831]: E1203 06:51:49.232083 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80da9f17-8958-4454-a183-2c4761b6847d" containerName="glance-httpd" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232088 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80da9f17-8958-4454-a183-2c4761b6847d" containerName="glance-httpd" Dec 03 06:51:49 crc kubenswrapper[4831]: E1203 06:51:49.232098 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerName="init" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232103 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerName="init" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232273 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c3895c6-516b-4681-b3c3-0e9291dfe322" containerName="dnsmasq-dns" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232283 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1fdcf55-846d-4f97-b3e9-fea6afd59eb5" containerName="init" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232293 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1331d712-2aa9-4aff-80dc-aed681290923" containerName="keystone-bootstrap" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232301 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="80da9f17-8958-4454-a183-2c4761b6847d" containerName="glance-log" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.232317 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="80da9f17-8958-4454-a183-2c4761b6847d" containerName="glance-httpd" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.233168 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.235650 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.235836 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.241031 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.289818 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-89rlq"] Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.291110 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.293604 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.294165 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8w6b" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.294534 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.296543 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.296714 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.307211 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-89rlq"] Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.367769 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.367838 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-logs\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.367869 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.367897 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.368046 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.368109 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qh5rw\" (UniqueName: \"kubernetes.io/projected/ba03de9b-b96c-4fc0-8534-64e3735dcea4-kube-api-access-qh5rw\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.368137 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.368193 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470459 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-config-data\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470530 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-combined-ca-bundle\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470567 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-fernet-keys\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470656 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9l5xl\" (UniqueName: \"kubernetes.io/projected/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-kube-api-access-9l5xl\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470778 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470882 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-logs\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470915 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470958 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.470991 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-credential-keys\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.471052 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-scripts\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.471112 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.471183 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qh5rw\" (UniqueName: \"kubernetes.io/projected/ba03de9b-b96c-4fc0-8534-64e3735dcea4-kube-api-access-qh5rw\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.471228 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.471292 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.471401 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-logs\") pod 
\"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.471678 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.471756 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.476047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.477381 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-scripts\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.478851 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-config-data\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " 
pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.481185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.492940 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qh5rw\" (UniqueName: \"kubernetes.io/projected/ba03de9b-b96c-4fc0-8534-64e3735dcea4-kube-api-access-qh5rw\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.502041 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.561808 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.574434 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-credential-keys\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.574794 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-scripts\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.574887 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-config-data\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.574923 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-combined-ca-bundle\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.574961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-fernet-keys\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 
06:51:49.575139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9l5xl\" (UniqueName: \"kubernetes.io/projected/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-kube-api-access-9l5xl\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.578404 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-scripts\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.578905 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-config-data\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.579484 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-combined-ca-bundle\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.580433 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-credential-keys\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.581274 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-fernet-keys\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.595865 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9l5xl\" (UniqueName: \"kubernetes.io/projected/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-kube-api-access-9l5xl\") pod \"keystone-bootstrap-89rlq\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:49 crc kubenswrapper[4831]: I1203 06:51:49.618002 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:51:51 crc kubenswrapper[4831]: I1203 06:51:51.032980 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1331d712-2aa9-4aff-80dc-aed681290923" path="/var/lib/kubelet/pods/1331d712-2aa9-4aff-80dc-aed681290923/volumes" Dec 03 06:51:51 crc kubenswrapper[4831]: I1203 06:51:51.034834 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80da9f17-8958-4454-a183-2c4761b6847d" path="/var/lib/kubelet/pods/80da9f17-8958-4454-a183-2c4761b6847d/volumes" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.047090 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.053292 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.223883 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"134c032a-7dc1-45df-9512-849be8d385bf","Type":"ContainerDied","Data":"7ce65f8f315add44511954df2cc85c4d4d119bd2bf255ed524d84ef5986baf05"} Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.223932 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.229684 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" event={"ID":"ff831122-53fa-4905-a6c7-71f216c98da5","Type":"ContainerDied","Data":"61e80bbe7a8066cf24d9b0fb26685887c26c087ebe567e68da5103160fec7f26"} Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.229769 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.232252 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-combined-ca-bundle\") pod \"134c032a-7dc1-45df-9512-849be8d385bf\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.232300 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-internal-tls-certs\") pod \"134c032a-7dc1-45df-9512-849be8d385bf\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.232335 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-httpd-run\") pod \"134c032a-7dc1-45df-9512-849be8d385bf\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.232375 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"134c032a-7dc1-45df-9512-849be8d385bf\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.233718 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "134c032a-7dc1-45df-9512-849be8d385bf" (UID: "134c032a-7dc1-45df-9512-849be8d385bf"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.234872 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-sb\") pod \"ff831122-53fa-4905-a6c7-71f216c98da5\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.235003 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-scripts\") pod \"134c032a-7dc1-45df-9512-849be8d385bf\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.235663 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-config-data\") pod \"134c032a-7dc1-45df-9512-849be8d385bf\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.236433 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-config\") pod \"ff831122-53fa-4905-a6c7-71f216c98da5\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.236842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-nb\") pod \"ff831122-53fa-4905-a6c7-71f216c98da5\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.236947 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: 
"glance") pod "134c032a-7dc1-45df-9512-849be8d385bf" (UID: "134c032a-7dc1-45df-9512-849be8d385bf"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.237078 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-dns-svc\") pod \"ff831122-53fa-4905-a6c7-71f216c98da5\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.237153 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjb62\" (UniqueName: \"kubernetes.io/projected/ff831122-53fa-4905-a6c7-71f216c98da5-kube-api-access-sjb62\") pod \"ff831122-53fa-4905-a6c7-71f216c98da5\" (UID: \"ff831122-53fa-4905-a6c7-71f216c98da5\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.237252 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcnxm\" (UniqueName: \"kubernetes.io/projected/134c032a-7dc1-45df-9512-849be8d385bf-kube-api-access-bcnxm\") pod \"134c032a-7dc1-45df-9512-849be8d385bf\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.237347 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-logs\") pod \"134c032a-7dc1-45df-9512-849be8d385bf\" (UID: \"134c032a-7dc1-45df-9512-849be8d385bf\") " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.237966 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.238064 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.240100 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-logs" (OuterVolumeSpecName: "logs") pod "134c032a-7dc1-45df-9512-849be8d385bf" (UID: "134c032a-7dc1-45df-9512-849be8d385bf"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.241440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-scripts" (OuterVolumeSpecName: "scripts") pod "134c032a-7dc1-45df-9512-849be8d385bf" (UID: "134c032a-7dc1-45df-9512-849be8d385bf"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.242660 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff831122-53fa-4905-a6c7-71f216c98da5-kube-api-access-sjb62" (OuterVolumeSpecName: "kube-api-access-sjb62") pod "ff831122-53fa-4905-a6c7-71f216c98da5" (UID: "ff831122-53fa-4905-a6c7-71f216c98da5"). InnerVolumeSpecName "kube-api-access-sjb62". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.261255 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/134c032a-7dc1-45df-9512-849be8d385bf-kube-api-access-bcnxm" (OuterVolumeSpecName: "kube-api-access-bcnxm") pod "134c032a-7dc1-45df-9512-849be8d385bf" (UID: "134c032a-7dc1-45df-9512-849be8d385bf"). InnerVolumeSpecName "kube-api-access-bcnxm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.271049 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.278750 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "134c032a-7dc1-45df-9512-849be8d385bf" (UID: "134c032a-7dc1-45df-9512-849be8d385bf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.286289 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ff831122-53fa-4905-a6c7-71f216c98da5" (UID: "ff831122-53fa-4905-a6c7-71f216c98da5"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.287492 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ff831122-53fa-4905-a6c7-71f216c98da5" (UID: "ff831122-53fa-4905-a6c7-71f216c98da5"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.289344 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-config-data" (OuterVolumeSpecName: "config-data") pod "134c032a-7dc1-45df-9512-849be8d385bf" (UID: "134c032a-7dc1-45df-9512-849be8d385bf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.291268 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-config" (OuterVolumeSpecName: "config") pod "ff831122-53fa-4905-a6c7-71f216c98da5" (UID: "ff831122-53fa-4905-a6c7-71f216c98da5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.297567 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "134c032a-7dc1-45df-9512-849be8d385bf" (UID: "134c032a-7dc1-45df-9512-849be8d385bf"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.311868 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ff831122-53fa-4905-a6c7-71f216c98da5" (UID: "ff831122-53fa-4905-a6c7-71f216c98da5"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339038 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339253 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjb62\" (UniqueName: \"kubernetes.io/projected/ff831122-53fa-4905-a6c7-71f216c98da5-kube-api-access-sjb62\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339330 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcnxm\" (UniqueName: \"kubernetes.io/projected/134c032a-7dc1-45df-9512-849be8d385bf-kube-api-access-bcnxm\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339395 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/134c032a-7dc1-45df-9512-849be8d385bf-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339454 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339512 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339567 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339621 4831 
reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339678 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339737 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/134c032a-7dc1-45df-9512-849be8d385bf-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.339991 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.340067 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ff831122-53fa-4905-a6c7-71f216c98da5-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.559669 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.566409 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.594731 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bkmqn"] Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.604563 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-bkmqn"] Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.613632 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:56 crc kubenswrapper[4831]: E1203 06:51:56.614045 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" containerName="init" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.614066 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" containerName="init" Dec 03 06:51:56 crc kubenswrapper[4831]: E1203 06:51:56.614081 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134c032a-7dc1-45df-9512-849be8d385bf" containerName="glance-httpd" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.614091 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="134c032a-7dc1-45df-9512-849be8d385bf" containerName="glance-httpd" Dec 03 06:51:56 crc kubenswrapper[4831]: E1203 06:51:56.614119 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="134c032a-7dc1-45df-9512-849be8d385bf" containerName="glance-log" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.614128 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="134c032a-7dc1-45df-9512-849be8d385bf" containerName="glance-log" Dec 03 06:51:56 crc kubenswrapper[4831]: E1203 06:51:56.614158 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" containerName="dnsmasq-dns" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.614166 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" containerName="dnsmasq-dns" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.614378 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="134c032a-7dc1-45df-9512-849be8d385bf" containerName="glance-httpd" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.614398 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="134c032a-7dc1-45df-9512-849be8d385bf" containerName="glance-log" Dec 03 
06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.614416 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" containerName="dnsmasq-dns" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.615579 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.620448 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.621049 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.621123 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.745734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.745935 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.746013 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fw7lr\" (UniqueName: 
\"kubernetes.io/projected/e55d9230-8363-41ae-b723-fc4193432067-kube-api-access-fw7lr\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.746204 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.746250 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-logs\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.746284 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.746329 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.746381 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.847488 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.847586 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.847613 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fw7lr\" (UniqueName: \"kubernetes.io/projected/e55d9230-8363-41ae-b723-fc4193432067-kube-api-access-fw7lr\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.847661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.847681 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-logs\") pod 
\"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.847713 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.847730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.847781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.848398 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.848571 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") device mount path 
\"/mnt/openstack/pv07\"" pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.849046 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-logs\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.853282 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.854694 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.855264 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.860044 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 
06:51:56.863193 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fw7lr\" (UniqueName: \"kubernetes.io/projected/e55d9230-8363-41ae-b723-fc4193432067-kube-api-access-fw7lr\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.878979 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:51:56 crc kubenswrapper[4831]: I1203 06:51:56.941750 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:51:57 crc kubenswrapper[4831]: I1203 06:51:57.025584 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="134c032a-7dc1-45df-9512-849be8d385bf" path="/var/lib/kubelet/pods/134c032a-7dc1-45df-9512-849be8d385bf/volumes" Dec 03 06:51:57 crc kubenswrapper[4831]: I1203 06:51:57.026209 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" path="/var/lib/kubelet/pods/ff831122-53fa-4905-a6c7-71f216c98da5/volumes" Dec 03 06:51:57 crc kubenswrapper[4831]: I1203 06:51:57.912585 4831 scope.go:117] "RemoveContainer" containerID="d2479529cd432da141540ab78e74d5cc78398e8e90645c6aca60dbbff68c1718" Dec 03 06:51:57 crc kubenswrapper[4831]: E1203 06:51:57.948169 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Dec 03 06:51:57 crc kubenswrapper[4831]: E1203 06:51:57.948370 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-66ksk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsU
ser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-dkxbk_openstack(6e478744-468b-40e8-b4a3-236bdd2bd5ca): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 06:51:57 crc kubenswrapper[4831]: E1203 06:51:57.950237 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-dkxbk" podUID="6e478744-468b-40e8-b4a3-236bdd2bd5ca" Dec 03 06:51:58 crc kubenswrapper[4831]: I1203 06:51:58.011034 4831 scope.go:117] "RemoveContainer" containerID="7eb566f9dd4995b0244674294fa6dea622a920acb64a0f2f460995ca44f73948" Dec 03 06:51:58 crc kubenswrapper[4831]: I1203 06:51:58.069104 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-b8fbc5445-bkmqn" podUID="ff831122-53fa-4905-a6c7-71f216c98da5" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.122:5353: i/o timeout" Dec 03 06:51:58 crc kubenswrapper[4831]: I1203 06:51:58.165073 4831 scope.go:117] "RemoveContainer" containerID="38e992e90f8cdd23f97b52c198fc2885241df14068b5bffcfb667f1dbd9cff24" Dec 03 06:51:58 crc kubenswrapper[4831]: I1203 06:51:58.230128 4831 scope.go:117] "RemoveContainer" containerID="003eea5872bfb8451fffe36f9af7621579806c93e61afbb27894cb3cfeb28df9" Dec 03 06:51:58 crc kubenswrapper[4831]: I1203 06:51:58.249757 4831 scope.go:117] "RemoveContainer" containerID="44812a208760edb2a83837f399a6e36dc6ace94363dcb8fc4fa5e091ea75c01c" Dec 03 06:51:58 crc kubenswrapper[4831]: E1203 
06:51:58.254110 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-dkxbk" podUID="6e478744-468b-40e8-b4a3-236bdd2bd5ca" Dec 03 06:51:58 crc kubenswrapper[4831]: I1203 06:51:58.589825 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:51:58 crc kubenswrapper[4831]: W1203 06:51:58.593788 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba03de9b_b96c_4fc0_8534_64e3735dcea4.slice/crio-4f7cb01c1d871723b053658961e9dfdd54591494dcd22ad0d535c96cdcfdae0a WatchSource:0}: Error finding container 4f7cb01c1d871723b053658961e9dfdd54591494dcd22ad0d535c96cdcfdae0a: Status 404 returned error can't find the container with id 4f7cb01c1d871723b053658961e9dfdd54591494dcd22ad0d535c96cdcfdae0a Dec 03 06:51:58 crc kubenswrapper[4831]: I1203 06:51:58.598122 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-89rlq"] Dec 03 06:51:58 crc kubenswrapper[4831]: W1203 06:51:58.598325 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfcbcaae5_50c9_4679_9af5_d9cf5ba7a6ff.slice/crio-8e36aff3afb3cc0625d16e4edf281c98740b541332199eba5940f39279057010 WatchSource:0}: Error finding container 8e36aff3afb3cc0625d16e4edf281c98740b541332199eba5940f39279057010: Status 404 returned error can't find the container with id 8e36aff3afb3cc0625d16e4edf281c98740b541332199eba5940f39279057010 Dec 03 06:51:58 crc kubenswrapper[4831]: I1203 06:51:58.702493 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.261692 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba03de9b-b96c-4fc0-8534-64e3735dcea4","Type":"ContainerStarted","Data":"2dfdf1c75dd66cea93f44482b31954d0a8d7559f695a599313d41eadeeb88bdb"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.262042 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba03de9b-b96c-4fc0-8534-64e3735dcea4","Type":"ContainerStarted","Data":"4f7cb01c1d871723b053658961e9dfdd54591494dcd22ad0d535c96cdcfdae0a"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.266124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerStarted","Data":"997a1617657d6929bd74782299aff40305a25d414d295a9191fde264fe18730a"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.267449 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hh6jm" event={"ID":"c58a3c00-1c5b-4c17-88d7-459798e81d76","Type":"ContainerStarted","Data":"5ef858aa3c34c29bab2a25c1ab6ce3b92a27fd595642d016617e9623d2c38b49"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.269525 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e55d9230-8363-41ae-b723-fc4193432067","Type":"ContainerStarted","Data":"870f2219e2f89ef890aa24f9fc2083e4e392bbfe523a139b2efad054382ed7b5"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.273410 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-89rlq" event={"ID":"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff","Type":"ContainerStarted","Data":"202e6afb367fa1688e99525965f011edec7e64075502bdc543a3d94545c54fcf"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.273459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-89rlq" 
event={"ID":"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff","Type":"ContainerStarted","Data":"8e36aff3afb3cc0625d16e4edf281c98740b541332199eba5940f39279057010"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.277609 4831 generic.go:334] "Generic (PLEG): container finished" podID="b6e74a76-928d-4a03-ae60-6749fafef9ae" containerID="4eef0053326ed74823bd51d439a4fac226fd4c4da5f8fb1077ece054b1be9bf8" exitCode=0 Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.277670 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-68mzw" event={"ID":"b6e74a76-928d-4a03-ae60-6749fafef9ae","Type":"ContainerDied","Data":"4eef0053326ed74823bd51d439a4fac226fd4c4da5f8fb1077ece054b1be9bf8"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.280036 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t4m8r" event={"ID":"9f2a305c-36bf-477e-8468-919407ea5d90","Type":"ContainerStarted","Data":"491f85725d63c87d76c7a6e62d6c33ebed834612994f604e9a07a041f9273116"} Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.295124 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-hh6jm" podStartSLOduration=5.235768406 podStartE2EDuration="23.295089661s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="2025-12-03 06:51:37.865574118 +0000 UTC m=+1235.209157626" lastFinishedPulling="2025-12-03 06:51:55.924895353 +0000 UTC m=+1253.268478881" observedRunningTime="2025-12-03 06:51:59.287568644 +0000 UTC m=+1256.631152162" watchObservedRunningTime="2025-12-03 06:51:59.295089661 +0000 UTC m=+1256.638673169" Dec 03 06:51:59 crc kubenswrapper[4831]: I1203 06:51:59.331535 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-89rlq" podStartSLOduration=10.331517412 podStartE2EDuration="10.331517412s" podCreationTimestamp="2025-12-03 06:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:51:59.31964203 +0000 UTC m=+1256.663225538" watchObservedRunningTime="2025-12-03 06:51:59.331517412 +0000 UTC m=+1256.675100910" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.290129 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba03de9b-b96c-4fc0-8534-64e3735dcea4","Type":"ContainerStarted","Data":"ac319c6003550dd5b09a7a590a5ff5ccb1bfaf5e3bbacedc96c230bef222f23a"} Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.294127 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerStarted","Data":"9b09581aa4cf02f951f3945c173c08573f08c12c6fb8a54a186f1e8f7b1c734a"} Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.298331 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e55d9230-8363-41ae-b723-fc4193432067","Type":"ContainerStarted","Data":"c7e9b0a650265fb0d7e98932b0ace54706b2cc22c62a95bf09a219d4be3ee1bb"} Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.298381 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e55d9230-8363-41ae-b723-fc4193432067","Type":"ContainerStarted","Data":"eae2681a7008672e1bf1e656ca7a562f6fce2305da8067c939311505d06146cc"} Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.323100 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-t4m8r" podStartSLOduration=4.271418048 podStartE2EDuration="24.323076294s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="2025-12-03 06:51:37.902129764 +0000 UTC m=+1235.245713272" lastFinishedPulling="2025-12-03 06:51:57.95378797 +0000 UTC m=+1255.297371518" observedRunningTime="2025-12-03 06:51:59.340096691 +0000 UTC m=+1256.683680219" 
watchObservedRunningTime="2025-12-03 06:52:00.323076294 +0000 UTC m=+1257.666659802" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.324828 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=11.324814478 podStartE2EDuration="11.324814478s" podCreationTimestamp="2025-12-03 06:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:00.315121244 +0000 UTC m=+1257.658704772" watchObservedRunningTime="2025-12-03 06:52:00.324814478 +0000 UTC m=+1257.668397986" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.351474 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.351455032 podStartE2EDuration="4.351455032s" podCreationTimestamp="2025-12-03 06:51:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:00.3408343 +0000 UTC m=+1257.684417808" watchObservedRunningTime="2025-12-03 06:52:00.351455032 +0000 UTC m=+1257.695038540" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.694664 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-68mzw" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.815987 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgwvx\" (UniqueName: \"kubernetes.io/projected/b6e74a76-928d-4a03-ae60-6749fafef9ae-kube-api-access-zgwvx\") pod \"b6e74a76-928d-4a03-ae60-6749fafef9ae\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.816043 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-config\") pod \"b6e74a76-928d-4a03-ae60-6749fafef9ae\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.816124 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-combined-ca-bundle\") pod \"b6e74a76-928d-4a03-ae60-6749fafef9ae\" (UID: \"b6e74a76-928d-4a03-ae60-6749fafef9ae\") " Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.824491 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6e74a76-928d-4a03-ae60-6749fafef9ae-kube-api-access-zgwvx" (OuterVolumeSpecName: "kube-api-access-zgwvx") pod "b6e74a76-928d-4a03-ae60-6749fafef9ae" (UID: "b6e74a76-928d-4a03-ae60-6749fafef9ae"). InnerVolumeSpecName "kube-api-access-zgwvx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.847584 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-config" (OuterVolumeSpecName: "config") pod "b6e74a76-928d-4a03-ae60-6749fafef9ae" (UID: "b6e74a76-928d-4a03-ae60-6749fafef9ae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.892537 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6e74a76-928d-4a03-ae60-6749fafef9ae" (UID: "b6e74a76-928d-4a03-ae60-6749fafef9ae"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.918095 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgwvx\" (UniqueName: \"kubernetes.io/projected/b6e74a76-928d-4a03-ae60-6749fafef9ae-kube-api-access-zgwvx\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.918138 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:00 crc kubenswrapper[4831]: I1203 06:52:00.918151 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6e74a76-928d-4a03-ae60-6749fafef9ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.312390 4831 generic.go:334] "Generic (PLEG): container finished" podID="c58a3c00-1c5b-4c17-88d7-459798e81d76" containerID="5ef858aa3c34c29bab2a25c1ab6ce3b92a27fd595642d016617e9623d2c38b49" exitCode=0 Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.312482 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hh6jm" event={"ID":"c58a3c00-1c5b-4c17-88d7-459798e81d76","Type":"ContainerDied","Data":"5ef858aa3c34c29bab2a25c1ab6ce3b92a27fd595642d016617e9623d2c38b49"} Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.314630 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-db-sync-68mzw" event={"ID":"b6e74a76-928d-4a03-ae60-6749fafef9ae","Type":"ContainerDied","Data":"c80b230cea3f188974fefc687d1bc1955c19c712068cda5da38d9b1876d5caaf"} Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.314657 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c80b230cea3f188974fefc687d1bc1955c19c712068cda5da38d9b1876d5caaf" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.314660 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-68mzw" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.525139 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2mnp8"] Dec 03 06:52:01 crc kubenswrapper[4831]: E1203 06:52:01.525754 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6e74a76-928d-4a03-ae60-6749fafef9ae" containerName="neutron-db-sync" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.525766 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6e74a76-928d-4a03-ae60-6749fafef9ae" containerName="neutron-db-sync" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.525949 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6e74a76-928d-4a03-ae60-6749fafef9ae" containerName="neutron-db-sync" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.526775 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.556251 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2mnp8"] Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.624699 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-7db8f48c84-p8t9t"] Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.626380 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.630505 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.630653 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-r8xlx" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.630890 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.630995 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.642428 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.642492 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.642512 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l28js\" (UniqueName: \"kubernetes.io/projected/1a20f210-ae44-4272-865e-b3f869095c9a-kube-api-access-l28js\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc 
kubenswrapper[4831]: I1203 06:52:01.642564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.642605 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-config\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.642684 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.653933 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7db8f48c84-p8t9t"] Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.744939 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-ovndb-tls-certs\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745033 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745139 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26txw\" (UniqueName: \"kubernetes.io/projected/25c0f5bd-0634-4b2e-90e6-b73305b0d259-kube-api-access-26txw\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745198 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-config\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745236 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-combined-ca-bundle\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745357 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-httpd-config\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745411 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745471 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745500 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l28js\" (UniqueName: \"kubernetes.io/projected/1a20f210-ae44-4272-865e-b3f869095c9a-kube-api-access-l28js\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745595 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.745662 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-config\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.746048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-nb\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.746225 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-swift-storage-0\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.746361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-svc\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.746593 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-config\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.746940 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-sb\") pod \"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.769036 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l28js\" (UniqueName: \"kubernetes.io/projected/1a20f210-ae44-4272-865e-b3f869095c9a-kube-api-access-l28js\") pod 
\"dnsmasq-dns-6b7b667979-2mnp8\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.841382 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.848777 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-ovndb-tls-certs\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.848960 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26txw\" (UniqueName: \"kubernetes.io/projected/25c0f5bd-0634-4b2e-90e6-b73305b0d259-kube-api-access-26txw\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.848986 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-config\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.849038 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-combined-ca-bundle\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.849425 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" 
(UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-httpd-config\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.855481 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-combined-ca-bundle\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.856517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-config\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.856521 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-ovndb-tls-certs\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.862094 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-httpd-config\") pod \"neutron-7db8f48c84-p8t9t\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.870211 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26txw\" (UniqueName: \"kubernetes.io/projected/25c0f5bd-0634-4b2e-90e6-b73305b0d259-kube-api-access-26txw\") pod \"neutron-7db8f48c84-p8t9t\" (UID: 
\"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:01 crc kubenswrapper[4831]: I1203 06:52:01.958065 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:02 crc kubenswrapper[4831]: I1203 06:52:02.326346 4831 generic.go:334] "Generic (PLEG): container finished" podID="9f2a305c-36bf-477e-8468-919407ea5d90" containerID="491f85725d63c87d76c7a6e62d6c33ebed834612994f604e9a07a041f9273116" exitCode=0 Dec 03 06:52:02 crc kubenswrapper[4831]: I1203 06:52:02.326419 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t4m8r" event={"ID":"9f2a305c-36bf-477e-8468-919407ea5d90","Type":"ContainerDied","Data":"491f85725d63c87d76c7a6e62d6c33ebed834612994f604e9a07a041f9273116"} Dec 03 06:52:02 crc kubenswrapper[4831]: I1203 06:52:02.328218 4831 generic.go:334] "Generic (PLEG): container finished" podID="fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" containerID="202e6afb367fa1688e99525965f011edec7e64075502bdc543a3d94545c54fcf" exitCode=0 Dec 03 06:52:02 crc kubenswrapper[4831]: I1203 06:52:02.328289 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-89rlq" event={"ID":"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff","Type":"ContainerDied","Data":"202e6afb367fa1688e99525965f011edec7e64075502bdc543a3d94545c54fcf"} Dec 03 06:52:02 crc kubenswrapper[4831]: I1203 06:52:02.403210 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2mnp8"] Dec 03 06:52:02 crc kubenswrapper[4831]: I1203 06:52:02.606737 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-7db8f48c84-p8t9t"] Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.328671 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-844cdc6797-kqpvp"] Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.330258 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.332544 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.336629 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.344029 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-844cdc6797-kqpvp"] Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.361974 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" event={"ID":"1a20f210-ae44-4272-865e-b3f869095c9a","Type":"ContainerStarted","Data":"005632b9f2c35ac38dc5fc3fe7f6aad4ec07ed05bc430bc4aa664e77af659f8d"} Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.514063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-httpd-config\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.514110 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-internal-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.514130 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-combined-ca-bundle\") pod \"neutron-844cdc6797-kqpvp\" (UID: 
\"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.514147 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrh9v\" (UniqueName: \"kubernetes.io/projected/1f442a70-f040-4b0e-853d-6ce1f4caf63d-kube-api-access-jrh9v\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.514276 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-config\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.514325 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-ovndb-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.514529 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-public-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.615844 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-httpd-config\") pod \"neutron-844cdc6797-kqpvp\" (UID: 
\"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.615906 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-internal-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.615935 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-combined-ca-bundle\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.615956 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jrh9v\" (UniqueName: \"kubernetes.io/projected/1f442a70-f040-4b0e-853d-6ce1f4caf63d-kube-api-access-jrh9v\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.616030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-config\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.616080 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-ovndb-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 
03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.616124 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-public-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.629127 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-public-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.629309 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-config\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.629832 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-internal-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.631379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-ovndb-tls-certs\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.634260 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-httpd-config\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.639000 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-combined-ca-bundle\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.639179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrh9v\" (UniqueName: \"kubernetes.io/projected/1f442a70-f040-4b0e-853d-6ce1f4caf63d-kube-api-access-jrh9v\") pod \"neutron-844cdc6797-kqpvp\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:04 crc kubenswrapper[4831]: I1203 06:52:04.654974 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:06 crc kubenswrapper[4831]: W1203 06:52:06.652065 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25c0f5bd_0634_4b2e_90e6_b73305b0d259.slice/crio-4096d24682bd7302d931f7b92f01775e586d38038bb12b2fd29a04121ff55d64 WatchSource:0}: Error finding container 4096d24682bd7302d931f7b92f01775e586d38038bb12b2fd29a04121ff55d64: Status 404 returned error can't find the container with id 4096d24682bd7302d931f7b92f01775e586d38038bb12b2fd29a04121ff55d64 Dec 03 06:52:06 crc kubenswrapper[4831]: I1203 06:52:06.943518 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:06 crc kubenswrapper[4831]: I1203 06:52:06.944126 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:06 crc kubenswrapper[4831]: I1203 06:52:06.985093 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:06 crc kubenswrapper[4831]: I1203 06:52:06.987369 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.014306 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hh6jm" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.125525 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.151191 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.160993 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g9tmm\" (UniqueName: \"kubernetes.io/projected/c58a3c00-1c5b-4c17-88d7-459798e81d76-kube-api-access-g9tmm\") pod \"c58a3c00-1c5b-4c17-88d7-459798e81d76\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.161199 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-config-data\") pod \"c58a3c00-1c5b-4c17-88d7-459798e81d76\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.161344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-combined-ca-bundle\") pod \"c58a3c00-1c5b-4c17-88d7-459798e81d76\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.161381 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58a3c00-1c5b-4c17-88d7-459798e81d76-logs\") pod \"c58a3c00-1c5b-4c17-88d7-459798e81d76\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.161455 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-scripts\") pod \"c58a3c00-1c5b-4c17-88d7-459798e81d76\" (UID: \"c58a3c00-1c5b-4c17-88d7-459798e81d76\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.163074 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/c58a3c00-1c5b-4c17-88d7-459798e81d76-logs" (OuterVolumeSpecName: "logs") pod "c58a3c00-1c5b-4c17-88d7-459798e81d76" (UID: "c58a3c00-1c5b-4c17-88d7-459798e81d76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.171636 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c58a3c00-1c5b-4c17-88d7-459798e81d76-kube-api-access-g9tmm" (OuterVolumeSpecName: "kube-api-access-g9tmm") pod "c58a3c00-1c5b-4c17-88d7-459798e81d76" (UID: "c58a3c00-1c5b-4c17-88d7-459798e81d76"). InnerVolumeSpecName "kube-api-access-g9tmm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.178576 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-scripts" (OuterVolumeSpecName: "scripts") pod "c58a3c00-1c5b-4c17-88d7-459798e81d76" (UID: "c58a3c00-1c5b-4c17-88d7-459798e81d76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.248822 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c58a3c00-1c5b-4c17-88d7-459798e81d76" (UID: "c58a3c00-1c5b-4c17-88d7-459798e81d76"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.263019 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-config-data" (OuterVolumeSpecName: "config-data") pod "c58a3c00-1c5b-4c17-88d7-459798e81d76" (UID: "c58a3c00-1c5b-4c17-88d7-459798e81d76"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.264609 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9l5xl\" (UniqueName: \"kubernetes.io/projected/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-kube-api-access-9l5xl\") pod \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.264726 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-db-sync-config-data\") pod \"9f2a305c-36bf-477e-8468-919407ea5d90\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.264787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-combined-ca-bundle\") pod \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.264856 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-fernet-keys\") pod \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.264901 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-scripts\") pod \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.264955 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlg7g\" 
(UniqueName: \"kubernetes.io/projected/9f2a305c-36bf-477e-8468-919407ea5d90-kube-api-access-nlg7g\") pod \"9f2a305c-36bf-477e-8468-919407ea5d90\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.264987 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-credential-keys\") pod \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.265088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-config-data\") pod \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\" (UID: \"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.265112 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-combined-ca-bundle\") pod \"9f2a305c-36bf-477e-8468-919407ea5d90\" (UID: \"9f2a305c-36bf-477e-8468-919407ea5d90\") " Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.265528 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c58a3c00-1c5b-4c17-88d7-459798e81d76-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.265549 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.265561 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g9tmm\" (UniqueName: 
\"kubernetes.io/projected/c58a3c00-1c5b-4c17-88d7-459798e81d76-kube-api-access-g9tmm\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.265574 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.265586 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c58a3c00-1c5b-4c17-88d7-459798e81d76-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.274547 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" (UID: "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.274821 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f2a305c-36bf-477e-8468-919407ea5d90-kube-api-access-nlg7g" (OuterVolumeSpecName: "kube-api-access-nlg7g") pod "9f2a305c-36bf-477e-8468-919407ea5d90" (UID: "9f2a305c-36bf-477e-8468-919407ea5d90"). InnerVolumeSpecName "kube-api-access-nlg7g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.279240 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-kube-api-access-9l5xl" (OuterVolumeSpecName: "kube-api-access-9l5xl") pod "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" (UID: "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff"). InnerVolumeSpecName "kube-api-access-9l5xl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.280432 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "9f2a305c-36bf-477e-8468-919407ea5d90" (UID: "9f2a305c-36bf-477e-8468-919407ea5d90"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.282986 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" (UID: "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.283052 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-scripts" (OuterVolumeSpecName: "scripts") pod "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" (UID: "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.289144 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-844cdc6797-kqpvp"] Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.303714 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f2a305c-36bf-477e-8468-919407ea5d90" (UID: "9f2a305c-36bf-477e-8468-919407ea5d90"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.308367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-config-data" (OuterVolumeSpecName: "config-data") pod "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" (UID: "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.309707 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" (UID: "fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367273 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367337 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367354 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9l5xl\" (UniqueName: \"kubernetes.io/projected/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-kube-api-access-9l5xl\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367367 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/9f2a305c-36bf-477e-8468-919407ea5d90-db-sync-config-data\") on node \"crc\" 
DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367381 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367392 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367402 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367414 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nlg7g\" (UniqueName: \"kubernetes.io/projected/9f2a305c-36bf-477e-8468-919407ea5d90-kube-api-access-nlg7g\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.367424 4831 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.392182 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db8f48c84-p8t9t" event={"ID":"25c0f5bd-0634-4b2e-90e6-b73305b0d259","Type":"ContainerStarted","Data":"90f51f3d66a5492835a792da32b949970ec9443e67e8846c79aa7e8c93ed8ce2"} Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.392228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db8f48c84-p8t9t" event={"ID":"25c0f5bd-0634-4b2e-90e6-b73305b0d259","Type":"ContainerStarted","Data":"4096d24682bd7302d931f7b92f01775e586d38038bb12b2fd29a04121ff55d64"} Dec 03 06:52:07 crc 
kubenswrapper[4831]: I1203 06:52:07.393504 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a20f210-ae44-4272-865e-b3f869095c9a" containerID="ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e" exitCode=0 Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.393739 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" event={"ID":"1a20f210-ae44-4272-865e-b3f869095c9a","Type":"ContainerDied","Data":"ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e"} Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.395407 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-t4m8r" event={"ID":"9f2a305c-36bf-477e-8468-919407ea5d90","Type":"ContainerDied","Data":"9f9b77991af6a9839b07dc279437ba6060076b65950d70e99b6675607d1bbed8"} Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.395430 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f9b77991af6a9839b07dc279437ba6060076b65950d70e99b6675607d1bbed8" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.395451 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-t4m8r" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.406526 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerStarted","Data":"ac234ed316a0604a189aab36ffc6354eac66ef142b09c418abc4c7d297ee7f65"} Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.408541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844cdc6797-kqpvp" event={"ID":"1f442a70-f040-4b0e-853d-6ce1f4caf63d","Type":"ContainerStarted","Data":"734823a44542390e6448c01e16a0d2f0ddfac87e3c04e07062f8b210d94e02cb"} Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.417302 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-hh6jm" event={"ID":"c58a3c00-1c5b-4c17-88d7-459798e81d76","Type":"ContainerDied","Data":"666a64d7877ba5d68da8014e2e260831d511ef7e28dca7a542dae391651d4062"} Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.417384 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="666a64d7877ba5d68da8014e2e260831d511ef7e28dca7a542dae391651d4062" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.417477 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-hh6jm" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.426701 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-89rlq" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.426749 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-89rlq" event={"ID":"fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff","Type":"ContainerDied","Data":"8e36aff3afb3cc0625d16e4edf281c98740b541332199eba5940f39279057010"} Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.426782 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8e36aff3afb3cc0625d16e4edf281c98740b541332199eba5940f39279057010" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.427178 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:07 crc kubenswrapper[4831]: I1203 06:52:07.427454 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.148644 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5d9bffbcdd-ztjkw"] Dec 03 06:52:08 crc kubenswrapper[4831]: E1203 06:52:08.149338 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c58a3c00-1c5b-4c17-88d7-459798e81d76" containerName="placement-db-sync" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.149355 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c58a3c00-1c5b-4c17-88d7-459798e81d76" containerName="placement-db-sync" Dec 03 06:52:08 crc kubenswrapper[4831]: E1203 06:52:08.149391 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" containerName="keystone-bootstrap" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.149400 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" containerName="keystone-bootstrap" Dec 03 06:52:08 crc kubenswrapper[4831]: E1203 06:52:08.149415 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="9f2a305c-36bf-477e-8468-919407ea5d90" containerName="barbican-db-sync" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.149424 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2a305c-36bf-477e-8468-919407ea5d90" containerName="barbican-db-sync" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.149636 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c58a3c00-1c5b-4c17-88d7-459798e81d76" containerName="placement-db-sync" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.149652 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2a305c-36bf-477e-8468-919407ea5d90" containerName="barbican-db-sync" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.149669 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" containerName="keystone-bootstrap" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.150754 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.153978 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.154205 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.155145 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-9lcvr" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.155441 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.155690 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.171124 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5d9bffbcdd-ztjkw"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.304950 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-combined-ca-bundle\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.305226 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgdxp\" (UniqueName: \"kubernetes.io/projected/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-kube-api-access-tgdxp\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.305422 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-config-data\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.305596 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-internal-tls-certs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.305710 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-scripts\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.305815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-logs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.305938 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-public-tls-certs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.332377 4831 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-665dcf9f4f-8g7pd"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.333464 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.338991 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-v8w6b" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.339214 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.339295 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.339461 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.339642 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.339814 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.346258 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-665dcf9f4f-8g7pd"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.407926 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-combined-ca-bundle\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.409897 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgdxp\" (UniqueName: 
\"kubernetes.io/projected/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-kube-api-access-tgdxp\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.410307 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-config-data\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.412801 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-internal-tls-certs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.416941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-internal-tls-certs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.417251 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-combined-ca-bundle\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.417632 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-scripts\") pod 
\"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.418076 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-logs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.418253 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-public-tls-certs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.418764 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-logs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.421153 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-config-data\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.436940 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-public-tls-certs\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc 
kubenswrapper[4831]: I1203 06:52:08.437459 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-scripts\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.485058 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgdxp\" (UniqueName: \"kubernetes.io/projected/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-kube-api-access-tgdxp\") pod \"placement-5d9bffbcdd-ztjkw\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.522214 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-55749f9879-hprsg"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.527709 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528541 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-fernet-keys\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528586 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-internal-tls-certs\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528636 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-config-data\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528675 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-combined-ca-bundle\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528706 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-public-tls-certs\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: 
\"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528726 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-credential-keys\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528747 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-scripts\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528763 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsrdw\" (UniqueName: \"kubernetes.io/projected/5067e964-1daa-4bbd-8e2b-872ce1067389-kube-api-access-jsrdw\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.528990 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844cdc6797-kqpvp" event={"ID":"1f442a70-f040-4b0e-853d-6ce1f4caf63d","Type":"ContainerStarted","Data":"504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e"} Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.529030 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844cdc6797-kqpvp" event={"ID":"1f442a70-f040-4b0e-853d-6ce1f4caf63d","Type":"ContainerStarted","Data":"dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d"} Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.529905 4831 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.535193 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.543376 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-n9qq9" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.544192 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.592461 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db8f48c84-p8t9t" event={"ID":"25c0f5bd-0634-4b2e-90e6-b73305b0d259","Type":"ContainerStarted","Data":"9311b1db5e01c49b0e90298d85e8c6bed68df9a1d8cd2a6b67631d71acfbd32b"} Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.593307 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.598599 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.601563 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.621601 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-55749f9879-hprsg"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.628840 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.629844 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4tm6\" (UniqueName: \"kubernetes.io/projected/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-kube-api-access-m4tm6\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.629878 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-logs\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.629912 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-combined-ca-bundle\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.629942 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data\") pod \"barbican-worker-55749f9879-hprsg\" (UID: 
\"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.629971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data-custom\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.630002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-fernet-keys\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.630019 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-internal-tls-certs\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.630106 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-config-data\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.630156 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-combined-ca-bundle\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " 
pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.630179 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-public-tls-certs\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.630198 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-credential-keys\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.630216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-scripts\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.630230 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsrdw\" (UniqueName: \"kubernetes.io/projected/5067e964-1daa-4bbd-8e2b-872ce1067389-kube-api-access-jsrdw\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.636201 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" event={"ID":"1a20f210-ae44-4272-865e-b3f869095c9a","Type":"ContainerStarted","Data":"7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf"} Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.636461 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.637195 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-fernet-keys\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.645914 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-public-tls-certs\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.652992 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-config-data\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.654900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-internal-tls-certs\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.654975 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.655217 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-scripts\") pod 
\"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.655904 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-combined-ca-bundle\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.656660 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-credential-keys\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.665574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsrdw\" (UniqueName: \"kubernetes.io/projected/5067e964-1daa-4bbd-8e2b-872ce1067389-kube-api-access-jsrdw\") pod \"keystone-665dcf9f4f-8g7pd\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.730218 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2mnp8"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732122 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-logs\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732174 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-combined-ca-bundle\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732246 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data-custom\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732270 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732271 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-844cdc6797-kqpvp" podStartSLOduration=4.732259573 podStartE2EDuration="4.732259573s" podCreationTimestamp="2025-12-03 06:52:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:08.629234155 +0000 UTC m=+1265.972817683" watchObservedRunningTime="2025-12-03 06:52:08.732259573 +0000 UTC m=+1266.075843081" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732325 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " 
pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732347 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data-custom\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732385 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdb7n\" (UniqueName: \"kubernetes.io/projected/fd770919-d80e-4947-b52d-673beb117374-kube-api-access-xdb7n\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732431 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd770919-d80e-4947-b52d-673beb117374-logs\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732447 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-combined-ca-bundle\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.732576 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4tm6\" (UniqueName: 
\"kubernetes.io/projected/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-kube-api-access-m4tm6\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.733171 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-logs\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.746359 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-combined-ca-bundle\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.751607 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data-custom\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.757404 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jjzjd"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.758720 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.762048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.769345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4tm6\" (UniqueName: \"kubernetes.io/projected/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-kube-api-access-m4tm6\") pod \"barbican-worker-55749f9879-hprsg\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.771824 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jjzjd"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.784458 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.786890 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-d4fb4bbcd-8782k"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.788694 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.791656 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-7db8f48c84-p8t9t" podStartSLOduration=7.791634294 podStartE2EDuration="7.791634294s" podCreationTimestamp="2025-12-03 06:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:08.687787489 +0000 UTC m=+1266.031370997" watchObservedRunningTime="2025-12-03 06:52:08.791634294 +0000 UTC m=+1266.135217802" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.794600 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.840940 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d4fb4bbcd-8782k"] Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.854205 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.854271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdb7n\" (UniqueName: \"kubernetes.io/projected/fd770919-d80e-4947-b52d-673beb117374-kube-api-access-xdb7n\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.854308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fd770919-d80e-4947-b52d-673beb117374-logs\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.854338 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-combined-ca-bundle\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.854308 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" podStartSLOduration=7.854291176 podStartE2EDuration="7.854291176s" podCreationTimestamp="2025-12-03 06:52:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:08.715842458 +0000 UTC m=+1266.059425966" watchObservedRunningTime="2025-12-03 06:52:08.854291176 +0000 UTC m=+1266.197874684" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.854430 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data-custom\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.859527 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd770919-d80e-4947-b52d-673beb117374-logs\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " 
pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.862704 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.872503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data-custom\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.873447 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-combined-ca-bundle\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.883479 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdb7n\" (UniqueName: \"kubernetes.io/projected/fd770919-d80e-4947-b52d-673beb117374-kube-api-access-xdb7n\") pod \"barbican-keystone-listener-6bc44fb5cd-xtqcc\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.932469 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.938824 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.967845 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.977574 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data-custom\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.977628 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.977735 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54778ced-f817-4738-9066-37b7328451d1-logs\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.977805 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-config\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: 
\"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.977836 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.977857 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.977899 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.977971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h969\" (UniqueName: \"kubernetes.io/projected/54778ced-f817-4738-9066-37b7328451d1-kube-api-access-4h969\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.978076 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-combined-ca-bundle\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.978099 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6mxq\" (UniqueName: \"kubernetes.io/projected/54839c01-76cd-4ab5-89a0-77494bfb730e-kube-api-access-p6mxq\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:08 crc kubenswrapper[4831]: I1203 06:52:08.978126 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080329 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6mxq\" (UniqueName: \"kubernetes.io/projected/54839c01-76cd-4ab5-89a0-77494bfb730e-kube-api-access-p6mxq\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080378 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080437 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data-custom\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080459 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080500 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54778ced-f817-4738-9066-37b7328451d1-logs\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080526 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-config\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080547 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080563 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080611 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h969\" (UniqueName: \"kubernetes.io/projected/54778ced-f817-4738-9066-37b7328451d1-kube-api-access-4h969\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.080639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-combined-ca-bundle\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.081673 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54778ced-f817-4738-9066-37b7328451d1-logs\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.082701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-svc\") pod 
\"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.084942 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-combined-ca-bundle\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.085552 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-config\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.086080 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.086673 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.086665 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " 
pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.088437 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data-custom\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.091964 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.101389 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6mxq\" (UniqueName: \"kubernetes.io/projected/54839c01-76cd-4ab5-89a0-77494bfb730e-kube-api-access-p6mxq\") pod \"dnsmasq-dns-848cf88cfc-jjzjd\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.105948 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h969\" (UniqueName: \"kubernetes.io/projected/54778ced-f817-4738-9066-37b7328451d1-kube-api-access-4h969\") pod \"barbican-api-d4fb4bbcd-8782k\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.253106 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.292275 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.334532 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5d9bffbcdd-ztjkw"] Dec 03 06:52:09 crc kubenswrapper[4831]: W1203 06:52:09.342501 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcccc3a0b_98c7_4930_a7b5_3c1320a5ee69.slice/crio-48ad030d7e7240a5a4a994cc89a03c1b56483b26d918f8e50821b561778b7b57 WatchSource:0}: Error finding container 48ad030d7e7240a5a4a994cc89a03c1b56483b26d918f8e50821b561778b7b57: Status 404 returned error can't find the container with id 48ad030d7e7240a5a4a994cc89a03c1b56483b26d918f8e50821b561778b7b57 Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.576073 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.576148 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.586008 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-55749f9879-hprsg"] Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.602107 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc"] Dec 03 06:52:09 crc kubenswrapper[4831]: W1203 06:52:09.609770 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3a92648c_9b8b_4fcf_b028_3f59ce2ebf27.slice/crio-2c73abbc92e0058e3d6c321b667f77cdef9878197d405a2c30654264d9aee524 WatchSource:0}: Error finding container 2c73abbc92e0058e3d6c321b667f77cdef9878197d405a2c30654264d9aee524: Status 404 returned error can't find the container with id 
2c73abbc92e0058e3d6c321b667f77cdef9878197d405a2c30654264d9aee524 Dec 03 06:52:09 crc kubenswrapper[4831]: W1203 06:52:09.631388 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd770919_d80e_4947_b52d_673beb117374.slice/crio-e98936f85844d71031d7dda15ce52290cc961d207f2d56462def738170a46451 WatchSource:0}: Error finding container e98936f85844d71031d7dda15ce52290cc961d207f2d56462def738170a46451: Status 404 returned error can't find the container with id e98936f85844d71031d7dda15ce52290cc961d207f2d56462def738170a46451 Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.657739 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.661459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d9bffbcdd-ztjkw" event={"ID":"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69","Type":"ContainerStarted","Data":"48ad030d7e7240a5a4a994cc89a03c1b56483b26d918f8e50821b561778b7b57"} Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.664121 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" event={"ID":"fd770919-d80e-4947-b52d-673beb117374","Type":"ContainerStarted","Data":"e98936f85844d71031d7dda15ce52290cc961d207f2d56462def738170a46451"} Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.673495 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55749f9879-hprsg" event={"ID":"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27","Type":"ContainerStarted","Data":"2c73abbc92e0058e3d6c321b667f77cdef9878197d405a2c30654264d9aee524"} Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.674680 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.688752 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.763361 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.763452 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:52:09 crc kubenswrapper[4831]: I1203 06:52:09.850487 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-665dcf9f4f-8g7pd"] Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.011476 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jjzjd"] Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.048861 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-d4fb4bbcd-8782k"] Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.122941 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.713053 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-665dcf9f4f-8g7pd" event={"ID":"5067e964-1daa-4bbd-8e2b-872ce1067389","Type":"ContainerStarted","Data":"8f1ceb54accf6cc95db136b64ce1f4080a7894f0ee7b6160b877d7044f1c4354"} Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.713385 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-665dcf9f4f-8g7pd" event={"ID":"5067e964-1daa-4bbd-8e2b-872ce1067389","Type":"ContainerStarted","Data":"c92eb12f4c11ef063a6c7bd73114db57fdf21e3ba90802835bee2899ce4b49fa"} Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.714955 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.736600 4831 generic.go:334] 
"Generic (PLEG): container finished" podID="54839c01-76cd-4ab5-89a0-77494bfb730e" containerID="a5627561cbf5dbdf46932ea85ec31175048d56585662c3edefd03f310966ee8a" exitCode=0 Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.736733 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" event={"ID":"54839c01-76cd-4ab5-89a0-77494bfb730e","Type":"ContainerDied","Data":"a5627561cbf5dbdf46932ea85ec31175048d56585662c3edefd03f310966ee8a"} Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.736761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" event={"ID":"54839c01-76cd-4ab5-89a0-77494bfb730e","Type":"ContainerStarted","Data":"b844f412962b72bdc9c8760f7a4e79e326fb38adc4c925a6a52d16d6d7c004ce"} Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.751447 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4fb4bbcd-8782k" event={"ID":"54778ced-f817-4738-9066-37b7328451d1","Type":"ContainerStarted","Data":"3dd0510e00e24e326f421df52e6b7aafe3de33f2df90d54e1402c10da898c408"} Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.751496 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4fb4bbcd-8782k" event={"ID":"54778ced-f817-4738-9066-37b7328451d1","Type":"ContainerStarted","Data":"91ba9faa465fc86c0b9f71f4c2dbf5415a39ffffaf26eb6dab7e9a9747eec908"} Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.756179 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-665dcf9f4f-8g7pd" podStartSLOduration=2.756163434 podStartE2EDuration="2.756163434s" podCreationTimestamp="2025-12-03 06:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:10.737188229 +0000 UTC m=+1268.080771737" watchObservedRunningTime="2025-12-03 06:52:10.756163434 +0000 UTC m=+1268.099746952" Dec 
03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.785567 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d9bffbcdd-ztjkw" event={"ID":"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69","Type":"ContainerStarted","Data":"4dfb2678a75458c31ce4e501a156049718eef44858bc8bb12bca6a8c8f4adbfa"} Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.785611 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d9bffbcdd-ztjkw" event={"ID":"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69","Type":"ContainerStarted","Data":"0c9038e6a18689fceba5e0a50ac6d4a0a041c704037c77f6242ca7ae37b78999"} Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.786644 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" podUID="1a20f210-ae44-4272-865e-b3f869095c9a" containerName="dnsmasq-dns" containerID="cri-o://7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf" gracePeriod=10 Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.786751 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.786768 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.786777 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 06:52:10 crc kubenswrapper[4831]: I1203 06:52:10.834852 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5d9bffbcdd-ztjkw" podStartSLOduration=2.834835019 podStartE2EDuration="2.834835019s" podCreationTimestamp="2025-12-03 06:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:10.82786285 +0000 UTC m=+1268.171446368" 
watchObservedRunningTime="2025-12-03 06:52:10.834835019 +0000 UTC m=+1268.178418527" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.244060 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-574cdc6988-72ggg"] Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.257739 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.258806 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-574cdc6988-72ggg"] Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.263722 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.274728 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.333837 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-combined-ca-bundle\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.333896 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-public-tls-certs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.333942 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.333983 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data-custom\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.334005 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-internal-tls-certs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.334024 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czslk\" (UniqueName: \"kubernetes.io/projected/770b98aa-f177-4c7b-b37e-1664c039f47d-kube-api-access-czslk\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.334050 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770b98aa-f177-4c7b-b37e-1664c039f47d-logs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.465221 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data-custom\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.465274 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-internal-tls-certs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.465295 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czslk\" (UniqueName: \"kubernetes.io/projected/770b98aa-f177-4c7b-b37e-1664c039f47d-kube-api-access-czslk\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.465338 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770b98aa-f177-4c7b-b37e-1664c039f47d-logs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.465612 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-combined-ca-bundle\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.465689 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-public-tls-certs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.465742 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.469585 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770b98aa-f177-4c7b-b37e-1664c039f47d-logs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.478654 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-public-tls-certs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.479361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-combined-ca-bundle\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.488475 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data-custom\") pod 
\"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.489622 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.493768 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-internal-tls-certs\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.500801 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czslk\" (UniqueName: \"kubernetes.io/projected/770b98aa-f177-4c7b-b37e-1664c039f47d-kube-api-access-czslk\") pod \"barbican-api-574cdc6988-72ggg\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.508990 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.668457 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l28js\" (UniqueName: \"kubernetes.io/projected/1a20f210-ae44-4272-865e-b3f869095c9a-kube-api-access-l28js\") pod \"1a20f210-ae44-4272-865e-b3f869095c9a\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.668518 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-nb\") pod \"1a20f210-ae44-4272-865e-b3f869095c9a\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.668570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-sb\") pod \"1a20f210-ae44-4272-865e-b3f869095c9a\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.668609 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-config\") pod \"1a20f210-ae44-4272-865e-b3f869095c9a\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.668695 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-svc\") pod \"1a20f210-ae44-4272-865e-b3f869095c9a\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.668717 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-swift-storage-0\") pod \"1a20f210-ae44-4272-865e-b3f869095c9a\" (UID: \"1a20f210-ae44-4272-865e-b3f869095c9a\") " Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.677900 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a20f210-ae44-4272-865e-b3f869095c9a-kube-api-access-l28js" (OuterVolumeSpecName: "kube-api-access-l28js") pod "1a20f210-ae44-4272-865e-b3f869095c9a" (UID: "1a20f210-ae44-4272-865e-b3f869095c9a"). InnerVolumeSpecName "kube-api-access-l28js". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.749280 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "1a20f210-ae44-4272-865e-b3f869095c9a" (UID: "1a20f210-ae44-4272-865e-b3f869095c9a"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.750928 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "1a20f210-ae44-4272-865e-b3f869095c9a" (UID: "1a20f210-ae44-4272-865e-b3f869095c9a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.765997 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.766756 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-config" (OuterVolumeSpecName: "config") pod "1a20f210-ae44-4272-865e-b3f869095c9a" (UID: "1a20f210-ae44-4272-865e-b3f869095c9a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.786369 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l28js\" (UniqueName: \"kubernetes.io/projected/1a20f210-ae44-4272-865e-b3f869095c9a-kube-api-access-l28js\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.786393 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.786403 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.786412 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.809974 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" event={"ID":"54839c01-76cd-4ab5-89a0-77494bfb730e","Type":"ContainerStarted","Data":"a39b9a0e8af48a1703c82b148772583d762fe1941b1f05de5c2dddab58397fa3"} Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.811058 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.818333 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "1a20f210-ae44-4272-865e-b3f869095c9a" (UID: "1a20f210-ae44-4272-865e-b3f869095c9a"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.828044 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a20f210-ae44-4272-865e-b3f869095c9a" containerID="7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf" exitCode=0 Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.828242 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.828558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" event={"ID":"1a20f210-ae44-4272-865e-b3f869095c9a","Type":"ContainerDied","Data":"7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf"} Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.828684 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6b7b667979-2mnp8" event={"ID":"1a20f210-ae44-4272-865e-b3f869095c9a","Type":"ContainerDied","Data":"005632b9f2c35ac38dc5fc3fe7f6aad4ec07ed05bc430bc4aa664e77af659f8d"} Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.828728 4831 scope.go:117] "RemoveContainer" containerID="7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.846032 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.846064 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4fb4bbcd-8782k" event={"ID":"54778ced-f817-4738-9066-37b7328451d1","Type":"ContainerStarted","Data":"1953e076b46a3e3c62ba0f755d17e277b6461b49fcb04e0d7a8393e70269ba8d"} Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.852034 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" podStartSLOduration=3.851091634 podStartE2EDuration="3.851091634s" podCreationTimestamp="2025-12-03 06:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:11.840894104 +0000 UTC m=+1269.184477622" watchObservedRunningTime="2025-12-03 06:52:11.851091634 +0000 UTC m=+1269.194675142" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.885934 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1a20f210-ae44-4272-865e-b3f869095c9a" (UID: "1a20f210-ae44-4272-865e-b3f869095c9a"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.888778 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-d4fb4bbcd-8782k" podStartSLOduration=3.888755814 podStartE2EDuration="3.888755814s" podCreationTimestamp="2025-12-03 06:52:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:11.869465779 +0000 UTC m=+1269.213049287" watchObservedRunningTime="2025-12-03 06:52:11.888755814 +0000 UTC m=+1269.232339322" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.888821 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.889134 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/1a20f210-ae44-4272-865e-b3f869095c9a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.953284 4831 scope.go:117] "RemoveContainer" containerID="ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e" Dec 03 06:52:11 crc kubenswrapper[4831]: I1203 06:52:11.997474 4831 scope.go:117] "RemoveContainer" containerID="7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf" Dec 03 06:52:12 crc kubenswrapper[4831]: E1203 06:52:12.004463 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf\": container with ID starting with 7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf not found: ID does not exist" containerID="7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf" Dec 03 06:52:12 crc 
kubenswrapper[4831]: I1203 06:52:12.004551 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf"} err="failed to get container status \"7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf\": rpc error: code = NotFound desc = could not find container \"7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf\": container with ID starting with 7e968b4c32be5e3c94399f9fe7bc7bad168ca361483a11aec10344d0683a1faf not found: ID does not exist" Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.004604 4831 scope.go:117] "RemoveContainer" containerID="ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e" Dec 03 06:52:12 crc kubenswrapper[4831]: E1203 06:52:12.005097 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e\": container with ID starting with ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e not found: ID does not exist" containerID="ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e" Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.005131 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e"} err="failed to get container status \"ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e\": rpc error: code = NotFound desc = could not find container \"ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e\": container with ID starting with ac495b650a58fae08b320708e8ce66243955ea12cc8c123f3c49db0a5e57be5e not found: ID does not exist" Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.289192 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2mnp8"] Dec 03 06:52:12 crc 
kubenswrapper[4831]: I1203 06:52:12.305892 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6b7b667979-2mnp8"] Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.552520 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-574cdc6988-72ggg"] Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.611952 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.887640 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.890045 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574cdc6988-72ggg" event={"ID":"770b98aa-f177-4c7b-b37e-1664c039f47d","Type":"ContainerStarted","Data":"473b00dcb7558e67e9aa08359e3ab4519754512b16d181ab50164a20e690d308"} Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.894101 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:12 crc kubenswrapper[4831]: I1203 06:52:12.894290 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:13 crc kubenswrapper[4831]: I1203 06:52:13.045880 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a20f210-ae44-4272-865e-b3f869095c9a" path="/var/lib/kubelet/pods/1a20f210-ae44-4272-865e-b3f869095c9a/volumes" Dec 03 06:52:13 crc kubenswrapper[4831]: I1203 06:52:13.874702 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 06:52:13 crc kubenswrapper[4831]: I1203 06:52:13.911482 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55749f9879-hprsg" 
event={"ID":"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27","Type":"ContainerStarted","Data":"9e502551a8c629987632ae4db8d5bb2dcc87f5d6796281aca903d0518128dc14"} Dec 03 06:52:13 crc kubenswrapper[4831]: I1203 06:52:13.918524 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" event={"ID":"fd770919-d80e-4947-b52d-673beb117374","Type":"ContainerStarted","Data":"aca1e2538c4e0fa0e9737c5d8550f625f94f8366334b8f2873812e5895924cfc"} Dec 03 06:52:13 crc kubenswrapper[4831]: I1203 06:52:13.929840 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574cdc6988-72ggg" event={"ID":"770b98aa-f177-4c7b-b37e-1664c039f47d","Type":"ContainerStarted","Data":"bfdaec1e442053ec75746ee712e4fee25a008c9a2e11ba56c206f7698abbcfcd"} Dec 03 06:52:14 crc kubenswrapper[4831]: I1203 06:52:14.948355 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" event={"ID":"fd770919-d80e-4947-b52d-673beb117374","Type":"ContainerStarted","Data":"4de12fb31b83a10dc830b04fa625356818e600bca291e7e4c8c179b0fba4550c"} Dec 03 06:52:14 crc kubenswrapper[4831]: I1203 06:52:14.950465 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574cdc6988-72ggg" event={"ID":"770b98aa-f177-4c7b-b37e-1664c039f47d","Type":"ContainerStarted","Data":"9c37d8393b980bdf5ae1d421b8b8297aec83f883d91765e900851761b2758842"} Dec 03 06:52:14 crc kubenswrapper[4831]: I1203 06:52:14.951143 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:14 crc kubenswrapper[4831]: I1203 06:52:14.951172 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:14 crc kubenswrapper[4831]: I1203 06:52:14.955975 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55749f9879-hprsg" 
event={"ID":"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27","Type":"ContainerStarted","Data":"499ef1cada2eb4986f003c7c29c7c54c82c99a884b5bcf1f1b4f4fbda65a3300"} Dec 03 06:52:14 crc kubenswrapper[4831]: I1203 06:52:14.958727 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dkxbk" event={"ID":"6e478744-468b-40e8-b4a3-236bdd2bd5ca","Type":"ContainerStarted","Data":"361dfa918d7a67eda1a1a06e0059d6ef2b6bf45cf14229417deb0087cee8f873"} Dec 03 06:52:15 crc kubenswrapper[4831]: I1203 06:52:14.999271 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" podStartSLOduration=3.338314395 podStartE2EDuration="6.999252784s" podCreationTimestamp="2025-12-03 06:52:08 +0000 UTC" firstStartedPulling="2025-12-03 06:52:09.64870383 +0000 UTC m=+1266.992287338" lastFinishedPulling="2025-12-03 06:52:13.309642229 +0000 UTC m=+1270.653225727" observedRunningTime="2025-12-03 06:52:14.978722951 +0000 UTC m=+1272.322306459" watchObservedRunningTime="2025-12-03 06:52:14.999252784 +0000 UTC m=+1272.342836292" Dec 03 06:52:15 crc kubenswrapper[4831]: I1203 06:52:15.004158 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-574cdc6988-72ggg" podStartSLOduration=4.004143387 podStartE2EDuration="4.004143387s" podCreationTimestamp="2025-12-03 06:52:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:14.997178339 +0000 UTC m=+1272.340761847" watchObservedRunningTime="2025-12-03 06:52:15.004143387 +0000 UTC m=+1272.347726895" Dec 03 06:52:15 crc kubenswrapper[4831]: I1203 06:52:15.038449 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-dkxbk" podStartSLOduration=3.007025787 podStartE2EDuration="39.038425492s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="2025-12-03 
06:51:37.531380096 +0000 UTC m=+1234.874963604" lastFinishedPulling="2025-12-03 06:52:13.562779801 +0000 UTC m=+1270.906363309" observedRunningTime="2025-12-03 06:52:15.02656351 +0000 UTC m=+1272.370147018" watchObservedRunningTime="2025-12-03 06:52:15.038425492 +0000 UTC m=+1272.382009000" Dec 03 06:52:15 crc kubenswrapper[4831]: I1203 06:52:15.048644 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-55749f9879-hprsg" podStartSLOduration=3.389221702 podStartE2EDuration="7.048620982s" podCreationTimestamp="2025-12-03 06:52:08 +0000 UTC" firstStartedPulling="2025-12-03 06:52:09.632305326 +0000 UTC m=+1266.975888834" lastFinishedPulling="2025-12-03 06:52:13.291704606 +0000 UTC m=+1270.635288114" observedRunningTime="2025-12-03 06:52:15.048591591 +0000 UTC m=+1272.392175109" watchObservedRunningTime="2025-12-03 06:52:15.048620982 +0000 UTC m=+1272.392204490" Dec 03 06:52:19 crc kubenswrapper[4831]: I1203 06:52:19.255537 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:19 crc kubenswrapper[4831]: I1203 06:52:19.310289 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xv9jp"] Dec 03 06:52:19 crc kubenswrapper[4831]: I1203 06:52:19.310549 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" podUID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" containerName="dnsmasq-dns" containerID="cri-o://d4b25be6a2b9d2dd9447b14b20d352089f64160bad21d040c013974a1a7d4440" gracePeriod=10 Dec 03 06:52:20 crc kubenswrapper[4831]: I1203 06:52:20.013566 4831 generic.go:334] "Generic (PLEG): container finished" podID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" containerID="d4b25be6a2b9d2dd9447b14b20d352089f64160bad21d040c013974a1a7d4440" exitCode=0 Dec 03 06:52:20 crc kubenswrapper[4831]: I1203 06:52:20.013683 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" event={"ID":"6fe82dbc-16d3-4d94-b356-d0b862ba2019","Type":"ContainerDied","Data":"d4b25be6a2b9d2dd9447b14b20d352089f64160bad21d040c013974a1a7d4440"} Dec 03 06:52:20 crc kubenswrapper[4831]: I1203 06:52:20.867641 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.001811 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-nb\") pod \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.001980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zqdwk\" (UniqueName: \"kubernetes.io/projected/6fe82dbc-16d3-4d94-b356-d0b862ba2019-kube-api-access-zqdwk\") pod \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.002148 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-config\") pod \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.002226 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-swift-storage-0\") pod \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.002263 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-sb\") pod \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.002309 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-svc\") pod \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\" (UID: \"6fe82dbc-16d3-4d94-b356-d0b862ba2019\") " Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.024609 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fe82dbc-16d3-4d94-b356-d0b862ba2019-kube-api-access-zqdwk" (OuterVolumeSpecName: "kube-api-access-zqdwk") pod "6fe82dbc-16d3-4d94-b356-d0b862ba2019" (UID: "6fe82dbc-16d3-4d94-b356-d0b862ba2019"). InnerVolumeSpecName "kube-api-access-zqdwk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.032405 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.055288 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6fe82dbc-16d3-4d94-b356-d0b862ba2019" (UID: "6fe82dbc-16d3-4d94-b356-d0b862ba2019"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.094749 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6fe82dbc-16d3-4d94-b356-d0b862ba2019" (UID: "6fe82dbc-16d3-4d94-b356-d0b862ba2019"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.101044 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6fe82dbc-16d3-4d94-b356-d0b862ba2019" (UID: "6fe82dbc-16d3-4d94-b356-d0b862ba2019"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.104503 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zqdwk\" (UniqueName: \"kubernetes.io/projected/6fe82dbc-16d3-4d94-b356-d0b862ba2019-kube-api-access-zqdwk\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.104525 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.104536 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.104544 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.117813 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-config" (OuterVolumeSpecName: "config") pod "6fe82dbc-16d3-4d94-b356-d0b862ba2019" (UID: "6fe82dbc-16d3-4d94-b356-d0b862ba2019"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.117959 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6fe82dbc-16d3-4d94-b356-d0b862ba2019" (UID: "6fe82dbc-16d3-4d94-b356-d0b862ba2019"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.162169 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-xv9jp" event={"ID":"6fe82dbc-16d3-4d94-b356-d0b862ba2019","Type":"ContainerDied","Data":"82203b0cf936c248d348a35b548d187a38fc3503a2d3e3adc9e2a3f36770a9b1"} Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.162376 4831 scope.go:117] "RemoveContainer" containerID="d4b25be6a2b9d2dd9447b14b20d352089f64160bad21d040c013974a1a7d4440" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.183510 4831 scope.go:117] "RemoveContainer" containerID="193af4264efc564c1aa18e6af7c841667713b4ea0f34fad4c1c23faffa7af9d9" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.205944 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.205969 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6fe82dbc-16d3-4d94-b356-d0b862ba2019-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.364988 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-xv9jp"] Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.372049 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-56df8fb6b7-xv9jp"] Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.372952 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:21 crc kubenswrapper[4831]: I1203 06:52:21.500070 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:22 crc kubenswrapper[4831]: I1203 06:52:22.054824 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerStarted","Data":"3438496b6bdd26d4cbbce4820298a70507fec7c916c5e396a89809c17892488b"} Dec 03 06:52:22 crc kubenswrapper[4831]: I1203 06:52:22.055343 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="ceilometer-central-agent" containerID="cri-o://997a1617657d6929bd74782299aff40305a25d414d295a9191fde264fe18730a" gracePeriod=30 Dec 03 06:52:22 crc kubenswrapper[4831]: I1203 06:52:22.055412 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 06:52:22 crc kubenswrapper[4831]: I1203 06:52:22.055465 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="proxy-httpd" containerID="cri-o://3438496b6bdd26d4cbbce4820298a70507fec7c916c5e396a89809c17892488b" gracePeriod=30 Dec 03 06:52:22 crc kubenswrapper[4831]: I1203 06:52:22.055523 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="sg-core" containerID="cri-o://ac234ed316a0604a189aab36ffc6354eac66ef142b09c418abc4c7d297ee7f65" gracePeriod=30 Dec 03 06:52:22 crc kubenswrapper[4831]: I1203 06:52:22.055572 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="ceilometer-notification-agent" containerID="cri-o://9b09581aa4cf02f951f3945c173c08573f08c12c6fb8a54a186f1e8f7b1c734a" gracePeriod=30 Dec 03 06:52:22 crc kubenswrapper[4831]: I1203 06:52:22.103490 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.145382033 podStartE2EDuration="46.103472922s" podCreationTimestamp="2025-12-03 06:51:36 +0000 UTC" firstStartedPulling="2025-12-03 06:51:37.91477681 +0000 UTC m=+1235.258360318" lastFinishedPulling="2025-12-03 06:52:20.872867699 +0000 UTC m=+1278.216451207" observedRunningTime="2025-12-03 06:52:22.101627633 +0000 UTC m=+1279.445211151" watchObservedRunningTime="2025-12-03 06:52:22.103472922 +0000 UTC m=+1279.447056420" Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.026993 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" path="/var/lib/kubelet/pods/6fe82dbc-16d3-4d94-b356-d0b862ba2019/volumes" Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.081581 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerID="3438496b6bdd26d4cbbce4820298a70507fec7c916c5e396a89809c17892488b" exitCode=0 Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.081638 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerID="ac234ed316a0604a189aab36ffc6354eac66ef142b09c418abc4c7d297ee7f65" exitCode=2 Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.081652 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerID="997a1617657d6929bd74782299aff40305a25d414d295a9191fde264fe18730a" exitCode=0 Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.081678 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerDied","Data":"3438496b6bdd26d4cbbce4820298a70507fec7c916c5e396a89809c17892488b"} Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.081746 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerDied","Data":"ac234ed316a0604a189aab36ffc6354eac66ef142b09c418abc4c7d297ee7f65"} Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.081769 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerDied","Data":"997a1617657d6929bd74782299aff40305a25d414d295a9191fde264fe18730a"} Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.083155 4831 generic.go:334] "Generic (PLEG): container finished" podID="6e478744-468b-40e8-b4a3-236bdd2bd5ca" containerID="361dfa918d7a67eda1a1a06e0059d6ef2b6bf45cf14229417deb0087cee8f873" exitCode=0 Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.083196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dkxbk" event={"ID":"6e478744-468b-40e8-b4a3-236bdd2bd5ca","Type":"ContainerDied","Data":"361dfa918d7a67eda1a1a06e0059d6ef2b6bf45cf14229417deb0087cee8f873"} Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.449753 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.451798 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.574376 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d4fb4bbcd-8782k"] Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.574939 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d4fb4bbcd-8782k" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api-log" containerID="cri-o://3dd0510e00e24e326f421df52e6b7aafe3de33f2df90d54e1402c10da898c408" gracePeriod=30 Dec 03 06:52:23 crc kubenswrapper[4831]: I1203 06:52:23.575135 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-d4fb4bbcd-8782k" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api" containerID="cri-o://1953e076b46a3e3c62ba0f755d17e277b6461b49fcb04e0d7a8393e70269ba8d" gracePeriod=30 Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.092815 4831 generic.go:334] "Generic (PLEG): container finished" podID="54778ced-f817-4738-9066-37b7328451d1" containerID="3dd0510e00e24e326f421df52e6b7aafe3de33f2df90d54e1402c10da898c408" exitCode=143 Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.093629 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4fb4bbcd-8782k" event={"ID":"54778ced-f817-4738-9066-37b7328451d1","Type":"ContainerDied","Data":"3dd0510e00e24e326f421df52e6b7aafe3de33f2df90d54e1402c10da898c408"} Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.442400 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.582288 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-db-sync-config-data\") pod \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.582346 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-config-data\") pod \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.582362 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-scripts\") pod \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.582509 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66ksk\" (UniqueName: \"kubernetes.io/projected/6e478744-468b-40e8-b4a3-236bdd2bd5ca-kube-api-access-66ksk\") pod \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.582552 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e478744-468b-40e8-b4a3-236bdd2bd5ca-etc-machine-id\") pod \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.582601 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-combined-ca-bundle\") pod \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\" (UID: \"6e478744-468b-40e8-b4a3-236bdd2bd5ca\") " Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.583766 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6e478744-468b-40e8-b4a3-236bdd2bd5ca-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "6e478744-468b-40e8-b4a3-236bdd2bd5ca" (UID: "6e478744-468b-40e8-b4a3-236bdd2bd5ca"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.588258 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-scripts" (OuterVolumeSpecName: "scripts") pod "6e478744-468b-40e8-b4a3-236bdd2bd5ca" (UID: "6e478744-468b-40e8-b4a3-236bdd2bd5ca"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.591565 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e478744-468b-40e8-b4a3-236bdd2bd5ca-kube-api-access-66ksk" (OuterVolumeSpecName: "kube-api-access-66ksk") pod "6e478744-468b-40e8-b4a3-236bdd2bd5ca" (UID: "6e478744-468b-40e8-b4a3-236bdd2bd5ca"). InnerVolumeSpecName "kube-api-access-66ksk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.594933 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "6e478744-468b-40e8-b4a3-236bdd2bd5ca" (UID: "6e478744-468b-40e8-b4a3-236bdd2bd5ca"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.613247 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6e478744-468b-40e8-b4a3-236bdd2bd5ca" (UID: "6e478744-468b-40e8-b4a3-236bdd2bd5ca"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.636653 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-config-data" (OuterVolumeSpecName: "config-data") pod "6e478744-468b-40e8-b4a3-236bdd2bd5ca" (UID: "6e478744-468b-40e8-b4a3-236bdd2bd5ca"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.685376 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66ksk\" (UniqueName: \"kubernetes.io/projected/6e478744-468b-40e8-b4a3-236bdd2bd5ca-kube-api-access-66ksk\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.685413 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e478744-468b-40e8-b4a3-236bdd2bd5ca-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.685422 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.685430 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-db-sync-config-data\") on 
node \"crc\" DevicePath \"\"" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.685470 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:24 crc kubenswrapper[4831]: I1203 06:52:24.685480 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e478744-468b-40e8-b4a3-236bdd2bd5ca-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.102456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-dkxbk" event={"ID":"6e478744-468b-40e8-b4a3-236bdd2bd5ca","Type":"ContainerDied","Data":"e9c6795a9a67c4e369d9d4937a274d77480327f5f5809ee87d1cd6e489970565"} Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.102491 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9c6795a9a67c4e369d9d4937a274d77480327f5f5809ee87d1cd6e489970565" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.102621 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-dkxbk" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.514803 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:25 crc kubenswrapper[4831]: E1203 06:52:25.515558 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e478744-468b-40e8-b4a3-236bdd2bd5ca" containerName="cinder-db-sync" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.515583 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e478744-468b-40e8-b4a3-236bdd2bd5ca" containerName="cinder-db-sync" Dec 03 06:52:25 crc kubenswrapper[4831]: E1203 06:52:25.515604 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" containerName="init" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.515612 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" containerName="init" Dec 03 06:52:25 crc kubenswrapper[4831]: E1203 06:52:25.515633 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" containerName="dnsmasq-dns" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.515641 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" containerName="dnsmasq-dns" Dec 03 06:52:25 crc kubenswrapper[4831]: E1203 06:52:25.515665 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a20f210-ae44-4272-865e-b3f869095c9a" containerName="init" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.515673 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a20f210-ae44-4272-865e-b3f869095c9a" containerName="init" Dec 03 06:52:25 crc kubenswrapper[4831]: E1203 06:52:25.515684 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a20f210-ae44-4272-865e-b3f869095c9a" containerName="dnsmasq-dns" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 
06:52:25.515691 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a20f210-ae44-4272-865e-b3f869095c9a" containerName="dnsmasq-dns" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.515920 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a20f210-ae44-4272-865e-b3f869095c9a" containerName="dnsmasq-dns" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.515942 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e478744-468b-40e8-b4a3-236bdd2bd5ca" containerName="cinder-db-sync" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.515969 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fe82dbc-16d3-4d94-b356-d0b862ba2019" containerName="dnsmasq-dns" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.517159 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.522994 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.523233 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-qh756" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.523567 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.524141 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.552505 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.605564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-scripts\") 
pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.605656 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.605714 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.605838 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1e42523-8ecb-451e-adc8-fb87212c45ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.605870 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.605909 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtdw8\" (UniqueName: \"kubernetes.io/projected/e1e42523-8ecb-451e-adc8-fb87212c45ba-kube-api-access-gtdw8\") pod \"cinder-scheduler-0\" (UID: 
\"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.608849 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bggbl"] Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.610808 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.626892 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bggbl"] Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-scripts\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708354 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708385 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708414 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data\") 
pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708586 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708744 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708798 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-config\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708839 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1e42523-8ecb-451e-adc8-fb87212c45ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: 
\"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708863 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708987 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj87h\" (UniqueName: \"kubernetes.io/projected/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-kube-api-access-wj87h\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.709087 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtdw8\" (UniqueName: \"kubernetes.io/projected/e1e42523-8ecb-451e-adc8-fb87212c45ba-kube-api-access-gtdw8\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.708983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1e42523-8ecb-451e-adc8-fb87212c45ba-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.714251 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc 
kubenswrapper[4831]: I1203 06:52:25.714959 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.717760 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.718077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-scripts\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.733120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtdw8\" (UniqueName: \"kubernetes.io/projected/e1e42523-8ecb-451e-adc8-fb87212c45ba-kube-api-access-gtdw8\") pod \"cinder-scheduler-0\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.773994 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.775449 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.778648 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.793637 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812375 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13682415-6e71-4b15-970c-311c9d163f3d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812428 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812445 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812462 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812479 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-config\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812501 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13682415-6e71-4b15-970c-311c9d163f3d-logs\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812531 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj87h\" (UniqueName: \"kubernetes.io/projected/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-kube-api-access-wj87h\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812566 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-scripts\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812592 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdh4h\" (UniqueName: \"kubernetes.io/projected/13682415-6e71-4b15-970c-311c9d163f3d-kube-api-access-kdh4h\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812621 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812646 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812667 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.812692 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data-custom\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.813393 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.813635 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-nb\") pod 
\"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.813725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-config\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.814190 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-svc\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.814448 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.829685 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj87h\" (UniqueName: \"kubernetes.io/projected/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-kube-api-access-wj87h\") pod \"dnsmasq-dns-6578955fd5-bggbl\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.865225 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.919824 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdh4h\" (UniqueName: \"kubernetes.io/projected/13682415-6e71-4b15-970c-311c9d163f3d-kube-api-access-kdh4h\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.920364 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.920417 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data-custom\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.920532 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13682415-6e71-4b15-970c-311c9d163f3d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.920568 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.920599 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13682415-6e71-4b15-970c-311c9d163f3d-logs\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.920644 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-scripts\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.925113 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.925431 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13682415-6e71-4b15-970c-311c9d163f3d-logs\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.925488 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13682415-6e71-4b15-970c-311c9d163f3d-etc-machine-id\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.926974 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-scripts\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.932091 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data-custom\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.933420 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.933938 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:25 crc kubenswrapper[4831]: I1203 06:52:25.945935 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdh4h\" (UniqueName: \"kubernetes.io/projected/13682415-6e71-4b15-970c-311c9d163f3d-kube-api-access-kdh4h\") pod \"cinder-api-0\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " pod="openstack/cinder-api-0" Dec 03 06:52:26 crc kubenswrapper[4831]: I1203 06:52:26.130971 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 06:52:26 crc kubenswrapper[4831]: I1203 06:52:26.346073 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:26 crc kubenswrapper[4831]: W1203 06:52:26.356449 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1e42523_8ecb_451e_adc8_fb87212c45ba.slice/crio-7eb47b71b8f545a12b2c6fc36e81a73cd1adfa8cb56e4b1566f4a89d2f24d6c0 WatchSource:0}: Error finding container 7eb47b71b8f545a12b2c6fc36e81a73cd1adfa8cb56e4b1566f4a89d2f24d6c0: Status 404 returned error can't find the container with id 7eb47b71b8f545a12b2c6fc36e81a73cd1adfa8cb56e4b1566f4a89d2f24d6c0 Dec 03 06:52:26 crc kubenswrapper[4831]: I1203 06:52:26.430127 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bggbl"] Dec 03 06:52:26 crc kubenswrapper[4831]: W1203 06:52:26.432208 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda3cf71a6_daa0_4f51_ba7d_f1663e5669e7.slice/crio-2a243ee5e2ac9b1265ede5fccbf297a17120811b5ab1827cff58d87adf39a0f2 WatchSource:0}: Error finding container 2a243ee5e2ac9b1265ede5fccbf297a17120811b5ab1827cff58d87adf39a0f2: Status 404 returned error can't find the container with id 2a243ee5e2ac9b1265ede5fccbf297a17120811b5ab1827cff58d87adf39a0f2 Dec 03 06:52:26 crc kubenswrapper[4831]: I1203 06:52:26.617808 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.032050 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d4fb4bbcd-8782k" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:39928->10.217.0.158:9311: read: connection reset by peer" 
Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.032115 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-d4fb4bbcd-8782k" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.158:9311/healthcheck\": read tcp 10.217.0.2:39922->10.217.0.158:9311: read: connection reset by peer" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.128014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13682415-6e71-4b15-970c-311c9d163f3d","Type":"ContainerStarted","Data":"ea811b6b8612da7581d425f4eca7a3ee7e3569912f7ee74575da9cf3ebe66436"} Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.129229 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1e42523-8ecb-451e-adc8-fb87212c45ba","Type":"ContainerStarted","Data":"7eb47b71b8f545a12b2c6fc36e81a73cd1adfa8cb56e4b1566f4a89d2f24d6c0"} Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.138591 4831 generic.go:334] "Generic (PLEG): container finished" podID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerID="189867c0bb890f5deb91e700fea6eb59d7952dd3df91cce7a5a46136b51ef4e8" exitCode=0 Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.139724 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" event={"ID":"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7","Type":"ContainerDied","Data":"189867c0bb890f5deb91e700fea6eb59d7952dd3df91cce7a5a46136b51ef4e8"} Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.139749 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" event={"ID":"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7","Type":"ContainerStarted","Data":"2a243ee5e2ac9b1265ede5fccbf297a17120811b5ab1827cff58d87adf39a0f2"} Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.148252 4831 generic.go:334] "Generic (PLEG): container 
finished" podID="54778ced-f817-4738-9066-37b7328451d1" containerID="1953e076b46a3e3c62ba0f755d17e277b6461b49fcb04e0d7a8393e70269ba8d" exitCode=0 Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.148361 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4fb4bbcd-8782k" event={"ID":"54778ced-f817-4738-9066-37b7328451d1","Type":"ContainerDied","Data":"1953e076b46a3e3c62ba0f755d17e277b6461b49fcb04e0d7a8393e70269ba8d"} Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.203275 4831 generic.go:334] "Generic (PLEG): container finished" podID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerID="9b09581aa4cf02f951f3945c173c08573f08c12c6fb8a54a186f1e8f7b1c734a" exitCode=0 Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.203356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerDied","Data":"9b09581aa4cf02f951f3945c173c08573f08c12c6fb8a54a186f1e8f7b1c734a"} Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.414577 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.462168 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-combined-ca-bundle\") pod \"a7a6ac9f-6470-4d84-a939-831f270eae54\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.462284 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-log-httpd\") pod \"a7a6ac9f-6470-4d84-a939-831f270eae54\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.462331 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-run-httpd\") pod \"a7a6ac9f-6470-4d84-a939-831f270eae54\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.462416 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-scripts\") pod \"a7a6ac9f-6470-4d84-a939-831f270eae54\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.462486 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-config-data\") pod \"a7a6ac9f-6470-4d84-a939-831f270eae54\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.462532 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t685g\" (UniqueName: 
\"kubernetes.io/projected/a7a6ac9f-6470-4d84-a939-831f270eae54-kube-api-access-t685g\") pod \"a7a6ac9f-6470-4d84-a939-831f270eae54\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.462585 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-sg-core-conf-yaml\") pod \"a7a6ac9f-6470-4d84-a939-831f270eae54\" (UID: \"a7a6ac9f-6470-4d84-a939-831f270eae54\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.464335 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a7a6ac9f-6470-4d84-a939-831f270eae54" (UID: "a7a6ac9f-6470-4d84-a939-831f270eae54"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.473598 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a7a6ac9f-6470-4d84-a939-831f270eae54" (UID: "a7a6ac9f-6470-4d84-a939-831f270eae54"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.476266 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7a6ac9f-6470-4d84-a939-831f270eae54-kube-api-access-t685g" (OuterVolumeSpecName: "kube-api-access-t685g") pod "a7a6ac9f-6470-4d84-a939-831f270eae54" (UID: "a7a6ac9f-6470-4d84-a939-831f270eae54"). InnerVolumeSpecName "kube-api-access-t685g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.481730 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-scripts" (OuterVolumeSpecName: "scripts") pod "a7a6ac9f-6470-4d84-a939-831f270eae54" (UID: "a7a6ac9f-6470-4d84-a939-831f270eae54"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.498272 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a7a6ac9f-6470-4d84-a939-831f270eae54" (UID: "a7a6ac9f-6470-4d84-a939-831f270eae54"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.528969 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.561127 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a7a6ac9f-6470-4d84-a939-831f270eae54" (UID: "a7a6ac9f-6470-4d84-a939-831f270eae54"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.566852 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-combined-ca-bundle\") pod \"54778ced-f817-4738-9066-37b7328451d1\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.566923 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54778ced-f817-4738-9066-37b7328451d1-logs\") pod \"54778ced-f817-4738-9066-37b7328451d1\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.566980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4h969\" (UniqueName: \"kubernetes.io/projected/54778ced-f817-4738-9066-37b7328451d1-kube-api-access-4h969\") pod \"54778ced-f817-4738-9066-37b7328451d1\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.567040 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data\") pod \"54778ced-f817-4738-9066-37b7328451d1\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.567080 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data-custom\") pod \"54778ced-f817-4738-9066-37b7328451d1\" (UID: \"54778ced-f817-4738-9066-37b7328451d1\") " Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.567498 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.567519 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a7a6ac9f-6470-4d84-a939-831f270eae54-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.567530 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.567542 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t685g\" (UniqueName: \"kubernetes.io/projected/a7a6ac9f-6470-4d84-a939-831f270eae54-kube-api-access-t685g\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.567555 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.567566 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.574790 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "54778ced-f817-4738-9066-37b7328451d1" (UID: "54778ced-f817-4738-9066-37b7328451d1"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.584376 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54778ced-f817-4738-9066-37b7328451d1-logs" (OuterVolumeSpecName: "logs") pod "54778ced-f817-4738-9066-37b7328451d1" (UID: "54778ced-f817-4738-9066-37b7328451d1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.591598 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54778ced-f817-4738-9066-37b7328451d1-kube-api-access-4h969" (OuterVolumeSpecName: "kube-api-access-4h969") pod "54778ced-f817-4738-9066-37b7328451d1" (UID: "54778ced-f817-4738-9066-37b7328451d1"). InnerVolumeSpecName "kube-api-access-4h969". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.631431 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "54778ced-f817-4738-9066-37b7328451d1" (UID: "54778ced-f817-4738-9066-37b7328451d1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.634462 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-config-data" (OuterVolumeSpecName: "config-data") pod "a7a6ac9f-6470-4d84-a939-831f270eae54" (UID: "a7a6ac9f-6470-4d84-a939-831f270eae54"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.662464 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.664506 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data" (OuterVolumeSpecName: "config-data") pod "54778ced-f817-4738-9066-37b7328451d1" (UID: "54778ced-f817-4738-9066-37b7328451d1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.668873 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.668921 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/54778ced-f817-4738-9066-37b7328451d1-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.668932 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4h969\" (UniqueName: \"kubernetes.io/projected/54778ced-f817-4738-9066-37b7328451d1-kube-api-access-4h969\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.668942 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc kubenswrapper[4831]: I1203 06:52:27.668952 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/54778ced-f817-4738-9066-37b7328451d1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:27 crc 
kubenswrapper[4831]: I1203 06:52:27.668961 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7a6ac9f-6470-4d84-a939-831f270eae54-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.215896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-d4fb4bbcd-8782k" event={"ID":"54778ced-f817-4738-9066-37b7328451d1","Type":"ContainerDied","Data":"91ba9faa465fc86c0b9f71f4c2dbf5415a39ffffaf26eb6dab7e9a9747eec908"} Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.216494 4831 scope.go:117] "RemoveContainer" containerID="1953e076b46a3e3c62ba0f755d17e277b6461b49fcb04e0d7a8393e70269ba8d" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.216523 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-d4fb4bbcd-8782k" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.242639 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a7a6ac9f-6470-4d84-a939-831f270eae54","Type":"ContainerDied","Data":"ad3bbf7df0f8212abe98d4c2e2ee78158e622e9d1d944b9218e52b270c9b9b31"} Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.242760 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.250670 4831 scope.go:117] "RemoveContainer" containerID="3dd0510e00e24e326f421df52e6b7aafe3de33f2df90d54e1402c10da898c408" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.264614 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13682415-6e71-4b15-970c-311c9d163f3d","Type":"ContainerStarted","Data":"3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7"} Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.264743 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13682415-6e71-4b15-970c-311c9d163f3d","Type":"ContainerStarted","Data":"42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4"} Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.264926 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="13682415-6e71-4b15-970c-311c9d163f3d" containerName="cinder-api-log" containerID="cri-o://42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4" gracePeriod=30 Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.265429 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="13682415-6e71-4b15-970c-311c9d163f3d" containerName="cinder-api" containerID="cri-o://3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7" gracePeriod=30 Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.265278 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.268002 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-d4fb4bbcd-8782k"] Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.271222 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"e1e42523-8ecb-451e-adc8-fb87212c45ba","Type":"ContainerStarted","Data":"35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3"} Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.283982 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" event={"ID":"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7","Type":"ContainerStarted","Data":"37442f25c3406732dc8f977e015ae0191d021a30df623fa7e2cf9a8112d10750"} Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.284376 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.292290 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-d4fb4bbcd-8782k"] Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.323006 4831 scope.go:117] "RemoveContainer" containerID="3438496b6bdd26d4cbbce4820298a70507fec7c916c5e396a89809c17892488b" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.323669 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.342371 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.348243 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.348224716 podStartE2EDuration="3.348224716s" podCreationTimestamp="2025-12-03 06:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:28.29474112 +0000 UTC m=+1285.638324628" watchObservedRunningTime="2025-12-03 06:52:28.348224716 +0000 UTC m=+1285.691808214" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.366950 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:28 
crc kubenswrapper[4831]: E1203 06:52:28.367623 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="ceilometer-notification-agent" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.367643 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="ceilometer-notification-agent" Dec 03 06:52:28 crc kubenswrapper[4831]: E1203 06:52:28.367658 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="sg-core" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.367663 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="sg-core" Dec 03 06:52:28 crc kubenswrapper[4831]: E1203 06:52:28.367672 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="ceilometer-central-agent" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.367678 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="ceilometer-central-agent" Dec 03 06:52:28 crc kubenswrapper[4831]: E1203 06:52:28.367701 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api-log" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.367707 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api-log" Dec 03 06:52:28 crc kubenswrapper[4831]: E1203 06:52:28.367723 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="proxy-httpd" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.367730 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="proxy-httpd" Dec 03 
06:52:28 crc kubenswrapper[4831]: E1203 06:52:28.367753 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.367781 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.368068 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.368096 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="ceilometer-notification-agent" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.368116 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="ceilometer-central-agent" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.368128 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="proxy-httpd" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.368148 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" containerName="sg-core" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.368188 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="54778ced-f817-4738-9066-37b7328451d1" containerName="barbican-api-log" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.370116 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.375870 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.376894 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.381131 4831 scope.go:117] "RemoveContainer" containerID="ac234ed316a0604a189aab36ffc6354eac66ef142b09c418abc4c7d297ee7f65" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.391626 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.423971 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" podStartSLOduration=3.423953439 podStartE2EDuration="3.423953439s" podCreationTimestamp="2025-12-03 06:52:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:28.338598574 +0000 UTC m=+1285.682182112" watchObservedRunningTime="2025-12-03 06:52:28.423953439 +0000 UTC m=+1285.767536947" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.454489 4831 scope.go:117] "RemoveContainer" containerID="9b09581aa4cf02f951f3945c173c08573f08c12c6fb8a54a186f1e8f7b1c734a" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.484128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.484178 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-run-httpd\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.484278 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-log-httpd\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.484333 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-scripts\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.484385 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-config-data\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.484417 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.484434 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7npm\" (UniqueName: \"kubernetes.io/projected/561be991-dcaa-43e8-aa44-36bc520372fe-kube-api-access-t7npm\") pod \"ceilometer-0\" (UID: 
\"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.498299 4831 scope.go:117] "RemoveContainer" containerID="997a1617657d6929bd74782299aff40305a25d414d295a9191fde264fe18730a" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.585859 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-log-httpd\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.586118 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-scripts\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.586139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-config-data\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.586176 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.586193 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7npm\" (UniqueName: \"kubernetes.io/projected/561be991-dcaa-43e8-aa44-36bc520372fe-kube-api-access-t7npm\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " 
pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.586254 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.586276 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-run-httpd\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.586678 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-run-httpd\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.587877 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-log-httpd\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.592149 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.592464 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.592579 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-scripts\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.593411 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-config-data\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.602843 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7npm\" (UniqueName: \"kubernetes.io/projected/561be991-dcaa-43e8-aa44-36bc520372fe-kube-api-access-t7npm\") pod \"ceilometer-0\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " pod="openstack/ceilometer-0" Dec 03 06:52:28 crc kubenswrapper[4831]: I1203 06:52:28.756502 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:29 crc kubenswrapper[4831]: I1203 06:52:29.053579 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54778ced-f817-4738-9066-37b7328451d1" path="/var/lib/kubelet/pods/54778ced-f817-4738-9066-37b7328451d1/volumes" Dec 03 06:52:29 crc kubenswrapper[4831]: I1203 06:52:29.059744 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a7a6ac9f-6470-4d84-a939-831f270eae54" path="/var/lib/kubelet/pods/a7a6ac9f-6470-4d84-a939-831f270eae54/volumes" Dec 03 06:52:29 crc kubenswrapper[4831]: W1203 06:52:29.255693 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod561be991_dcaa_43e8_aa44_36bc520372fe.slice/crio-6d43312896af3441a531a6af22bcf56de3eb731f79367632cd6c8297b81c1d07 WatchSource:0}: Error finding container 6d43312896af3441a531a6af22bcf56de3eb731f79367632cd6c8297b81c1d07: Status 404 returned error can't find the container with id 6d43312896af3441a531a6af22bcf56de3eb731f79367632cd6c8297b81c1d07 Dec 03 06:52:29 crc kubenswrapper[4831]: I1203 06:52:29.256216 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:29 crc kubenswrapper[4831]: I1203 06:52:29.294948 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1e42523-8ecb-451e-adc8-fb87212c45ba","Type":"ContainerStarted","Data":"5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f"} Dec 03 06:52:29 crc kubenswrapper[4831]: I1203 06:52:29.300049 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerStarted","Data":"6d43312896af3441a531a6af22bcf56de3eb731f79367632cd6c8297b81c1d07"} Dec 03 06:52:29 crc kubenswrapper[4831]: I1203 06:52:29.301679 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="13682415-6e71-4b15-970c-311c9d163f3d" containerID="42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4" exitCode=143 Dec 03 06:52:29 crc kubenswrapper[4831]: I1203 06:52:29.302547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13682415-6e71-4b15-970c-311c9d163f3d","Type":"ContainerDied","Data":"42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4"} Dec 03 06:52:30 crc kubenswrapper[4831]: I1203 06:52:30.339576 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerStarted","Data":"29aa0c2499b14fae3bcfd8bdd98b9a6c961c0d42b19897f90300f4136cde6e18"} Dec 03 06:52:30 crc kubenswrapper[4831]: I1203 06:52:30.865764 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 06:52:31 crc kubenswrapper[4831]: I1203 06:52:31.371273 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerStarted","Data":"5a11208a0d747fa6881cb7bb233e160fdf096c558cf7a3477399b883f603c094"} Dec 03 06:52:31 crc kubenswrapper[4831]: I1203 06:52:31.977873 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:32 crc kubenswrapper[4831]: I1203 06:52:32.010109 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=6.315806408 podStartE2EDuration="7.010077233s" podCreationTimestamp="2025-12-03 06:52:25 +0000 UTC" firstStartedPulling="2025-12-03 06:52:26.358568229 +0000 UTC m=+1283.702151747" lastFinishedPulling="2025-12-03 06:52:27.052839064 +0000 UTC m=+1284.396422572" observedRunningTime="2025-12-03 06:52:29.337827007 +0000 UTC m=+1286.681410515" watchObservedRunningTime="2025-12-03 06:52:32.010077233 +0000 UTC m=+1289.353660781" 
Dec 03 06:52:32 crc kubenswrapper[4831]: I1203 06:52:32.385568 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerStarted","Data":"9956faa0c6448ec45f1b30ab0bae883e690c8bf4a9c2c996f13f1c42e18b2b77"} Dec 03 06:52:33 crc kubenswrapper[4831]: I1203 06:52:33.400411 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerStarted","Data":"55fb83467af43f754587c9982b500c4a9bc6a85477a4efbace9ee5a6dc39c59b"} Dec 03 06:52:33 crc kubenswrapper[4831]: I1203 06:52:33.401044 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 06:52:33 crc kubenswrapper[4831]: I1203 06:52:33.431787 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.969457919 podStartE2EDuration="5.431761173s" podCreationTimestamp="2025-12-03 06:52:28 +0000 UTC" firstStartedPulling="2025-12-03 06:52:29.258585653 +0000 UTC m=+1286.602169161" lastFinishedPulling="2025-12-03 06:52:32.720888897 +0000 UTC m=+1290.064472415" observedRunningTime="2025-12-03 06:52:33.429207983 +0000 UTC m=+1290.772791501" watchObservedRunningTime="2025-12-03 06:52:33.431761173 +0000 UTC m=+1290.775344691" Dec 03 06:52:34 crc kubenswrapper[4831]: I1203 06:52:34.675302 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:52:34 crc kubenswrapper[4831]: I1203 06:52:34.762850 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7db8f48c84-p8t9t"] Dec 03 06:52:34 crc kubenswrapper[4831]: I1203 06:52:34.763360 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7db8f48c84-p8t9t" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerName="neutron-api" 
containerID="cri-o://90f51f3d66a5492835a792da32b949970ec9443e67e8846c79aa7e8c93ed8ce2" gracePeriod=30 Dec 03 06:52:34 crc kubenswrapper[4831]: I1203 06:52:34.763827 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-7db8f48c84-p8t9t" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerName="neutron-httpd" containerID="cri-o://9311b1db5e01c49b0e90298d85e8c6bed68df9a1d8cd2a6b67631d71acfbd32b" gracePeriod=30 Dec 03 06:52:35 crc kubenswrapper[4831]: I1203 06:52:35.419384 4831 generic.go:334] "Generic (PLEG): container finished" podID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerID="9311b1db5e01c49b0e90298d85e8c6bed68df9a1d8cd2a6b67631d71acfbd32b" exitCode=0 Dec 03 06:52:35 crc kubenswrapper[4831]: I1203 06:52:35.419413 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db8f48c84-p8t9t" event={"ID":"25c0f5bd-0634-4b2e-90e6-b73305b0d259","Type":"ContainerDied","Data":"9311b1db5e01c49b0e90298d85e8c6bed68df9a1d8cd2a6b67631d71acfbd32b"} Dec 03 06:52:35 crc kubenswrapper[4831]: I1203 06:52:35.935710 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.041563 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jjzjd"] Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.042077 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" podUID="54839c01-76cd-4ab5-89a0-77494bfb730e" containerName="dnsmasq-dns" containerID="cri-o://a39b9a0e8af48a1703c82b148772583d762fe1941b1f05de5c2dddab58397fa3" gracePeriod=10 Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.137637 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.227281 4831 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.498452 4831 generic.go:334] "Generic (PLEG): container finished" podID="54839c01-76cd-4ab5-89a0-77494bfb730e" containerID="a39b9a0e8af48a1703c82b148772583d762fe1941b1f05de5c2dddab58397fa3" exitCode=0 Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.498931 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerName="cinder-scheduler" containerID="cri-o://35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3" gracePeriod=30 Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.499256 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" event={"ID":"54839c01-76cd-4ab5-89a0-77494bfb730e","Type":"ContainerDied","Data":"a39b9a0e8af48a1703c82b148772583d762fe1941b1f05de5c2dddab58397fa3"} Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.499462 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerName="probe" containerID="cri-o://5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f" gracePeriod=30 Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.650540 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.652985 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-sb\") pod \"54839c01-76cd-4ab5-89a0-77494bfb730e\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.653046 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6mxq\" (UniqueName: \"kubernetes.io/projected/54839c01-76cd-4ab5-89a0-77494bfb730e-kube-api-access-p6mxq\") pod \"54839c01-76cd-4ab5-89a0-77494bfb730e\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.653073 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-svc\") pod \"54839c01-76cd-4ab5-89a0-77494bfb730e\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.653129 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-config\") pod \"54839c01-76cd-4ab5-89a0-77494bfb730e\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.653167 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-swift-storage-0\") pod \"54839c01-76cd-4ab5-89a0-77494bfb730e\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.653208 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-nb\") pod \"54839c01-76cd-4ab5-89a0-77494bfb730e\" (UID: \"54839c01-76cd-4ab5-89a0-77494bfb730e\") " Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.673673 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54839c01-76cd-4ab5-89a0-77494bfb730e-kube-api-access-p6mxq" (OuterVolumeSpecName: "kube-api-access-p6mxq") pod "54839c01-76cd-4ab5-89a0-77494bfb730e" (UID: "54839c01-76cd-4ab5-89a0-77494bfb730e"). InnerVolumeSpecName "kube-api-access-p6mxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.728045 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "54839c01-76cd-4ab5-89a0-77494bfb730e" (UID: "54839c01-76cd-4ab5-89a0-77494bfb730e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.729838 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "54839c01-76cd-4ab5-89a0-77494bfb730e" (UID: "54839c01-76cd-4ab5-89a0-77494bfb730e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.747912 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-config" (OuterVolumeSpecName: "config") pod "54839c01-76cd-4ab5-89a0-77494bfb730e" (UID: "54839c01-76cd-4ab5-89a0-77494bfb730e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.758145 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.758175 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.758186 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6mxq\" (UniqueName: \"kubernetes.io/projected/54839c01-76cd-4ab5-89a0-77494bfb730e-kube-api-access-p6mxq\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.758194 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.775043 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "54839c01-76cd-4ab5-89a0-77494bfb730e" (UID: "54839c01-76cd-4ab5-89a0-77494bfb730e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.794344 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "54839c01-76cd-4ab5-89a0-77494bfb730e" (UID: "54839c01-76cd-4ab5-89a0-77494bfb730e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.859996 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:36 crc kubenswrapper[4831]: I1203 06:52:36.860038 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/54839c01-76cd-4ab5-89a0-77494bfb730e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:37 crc kubenswrapper[4831]: I1203 06:52:37.509210 4831 generic.go:334] "Generic (PLEG): container finished" podID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerID="5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f" exitCode=0 Dec 03 06:52:37 crc kubenswrapper[4831]: I1203 06:52:37.509288 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1e42523-8ecb-451e-adc8-fb87212c45ba","Type":"ContainerDied","Data":"5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f"} Dec 03 06:52:37 crc kubenswrapper[4831]: I1203 06:52:37.511211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" event={"ID":"54839c01-76cd-4ab5-89a0-77494bfb730e","Type":"ContainerDied","Data":"b844f412962b72bdc9c8760f7a4e79e326fb38adc4c925a6a52d16d6d7c004ce"} Dec 03 06:52:37 crc kubenswrapper[4831]: I1203 06:52:37.511254 4831 scope.go:117] "RemoveContainer" containerID="a39b9a0e8af48a1703c82b148772583d762fe1941b1f05de5c2dddab58397fa3" Dec 03 06:52:37 crc kubenswrapper[4831]: I1203 06:52:37.511292 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-jjzjd" Dec 03 06:52:37 crc kubenswrapper[4831]: I1203 06:52:37.531884 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jjzjd"] Dec 03 06:52:37 crc kubenswrapper[4831]: I1203 06:52:37.538698 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-jjzjd"] Dec 03 06:52:37 crc kubenswrapper[4831]: I1203 06:52:37.540793 4831 scope.go:117] "RemoveContainer" containerID="a5627561cbf5dbdf46932ea85ec31175048d56585662c3edefd03f310966ee8a" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.180350 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.527105 4831 generic.go:334] "Generic (PLEG): container finished" podID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerID="90f51f3d66a5492835a792da32b949970ec9443e67e8846c79aa7e8c93ed8ce2" exitCode=0 Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.527164 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db8f48c84-p8t9t" event={"ID":"25c0f5bd-0634-4b2e-90e6-b73305b0d259","Type":"ContainerDied","Data":"90f51f3d66a5492835a792da32b949970ec9443e67e8846c79aa7e8c93ed8ce2"} Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.628224 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.790654 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-httpd-config\") pod \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.790983 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-ovndb-tls-certs\") pod \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.791076 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-combined-ca-bundle\") pod \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.791166 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-config\") pod \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.791212 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26txw\" (UniqueName: \"kubernetes.io/projected/25c0f5bd-0634-4b2e-90e6-b73305b0d259-kube-api-access-26txw\") pod \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\" (UID: \"25c0f5bd-0634-4b2e-90e6-b73305b0d259\") " Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.797371 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/25c0f5bd-0634-4b2e-90e6-b73305b0d259-kube-api-access-26txw" (OuterVolumeSpecName: "kube-api-access-26txw") pod "25c0f5bd-0634-4b2e-90e6-b73305b0d259" (UID: "25c0f5bd-0634-4b2e-90e6-b73305b0d259"). InnerVolumeSpecName "kube-api-access-26txw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.803839 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "25c0f5bd-0634-4b2e-90e6-b73305b0d259" (UID: "25c0f5bd-0634-4b2e-90e6-b73305b0d259"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.900026 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-26txw\" (UniqueName: \"kubernetes.io/projected/25c0f5bd-0634-4b2e-90e6-b73305b0d259-kube-api-access-26txw\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.900231 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.950874 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25c0f5bd-0634-4b2e-90e6-b73305b0d259" (UID: "25c0f5bd-0634-4b2e-90e6-b73305b0d259"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.959245 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-config" (OuterVolumeSpecName: "config") pod "25c0f5bd-0634-4b2e-90e6-b73305b0d259" (UID: "25c0f5bd-0634-4b2e-90e6-b73305b0d259"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:38 crc kubenswrapper[4831]: I1203 06:52:38.983495 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "25c0f5bd-0634-4b2e-90e6-b73305b0d259" (UID: "25c0f5bd-0634-4b2e-90e6-b73305b0d259"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.003198 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.003456 4831 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.003555 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25c0f5bd-0634-4b2e-90e6-b73305b0d259-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.025622 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54839c01-76cd-4ab5-89a0-77494bfb730e" path="/var/lib/kubelet/pods/54839c01-76cd-4ab5-89a0-77494bfb730e/volumes" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 
06:52:39.164258 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.208368 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-combined-ca-bundle\") pod \"e1e42523-8ecb-451e-adc8-fb87212c45ba\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.208628 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1e42523-8ecb-451e-adc8-fb87212c45ba-etc-machine-id\") pod \"e1e42523-8ecb-451e-adc8-fb87212c45ba\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.208743 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gtdw8\" (UniqueName: \"kubernetes.io/projected/e1e42523-8ecb-451e-adc8-fb87212c45ba-kube-api-access-gtdw8\") pod \"e1e42523-8ecb-451e-adc8-fb87212c45ba\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.208855 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data-custom\") pod \"e1e42523-8ecb-451e-adc8-fb87212c45ba\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.208965 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-scripts\") pod \"e1e42523-8ecb-451e-adc8-fb87212c45ba\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.209068 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data\") pod \"e1e42523-8ecb-451e-adc8-fb87212c45ba\" (UID: \"e1e42523-8ecb-451e-adc8-fb87212c45ba\") " Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.212858 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1e42523-8ecb-451e-adc8-fb87212c45ba-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e1e42523-8ecb-451e-adc8-fb87212c45ba" (UID: "e1e42523-8ecb-451e-adc8-fb87212c45ba"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.215770 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e1e42523-8ecb-451e-adc8-fb87212c45ba" (UID: "e1e42523-8ecb-451e-adc8-fb87212c45ba"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.223589 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1e42523-8ecb-451e-adc8-fb87212c45ba-kube-api-access-gtdw8" (OuterVolumeSpecName: "kube-api-access-gtdw8") pod "e1e42523-8ecb-451e-adc8-fb87212c45ba" (UID: "e1e42523-8ecb-451e-adc8-fb87212c45ba"). InnerVolumeSpecName "kube-api-access-gtdw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.229486 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-scripts" (OuterVolumeSpecName: "scripts") pod "e1e42523-8ecb-451e-adc8-fb87212c45ba" (UID: "e1e42523-8ecb-451e-adc8-fb87212c45ba"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.315501 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1e42523-8ecb-451e-adc8-fb87212c45ba" (UID: "e1e42523-8ecb-451e-adc8-fb87212c45ba"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.316574 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.316592 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e1e42523-8ecb-451e-adc8-fb87212c45ba-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.316601 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gtdw8\" (UniqueName: \"kubernetes.io/projected/e1e42523-8ecb-451e-adc8-fb87212c45ba-kube-api-access-gtdw8\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.316610 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.316620 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.437478 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data" (OuterVolumeSpecName: "config-data") pod "e1e42523-8ecb-451e-adc8-fb87212c45ba" (UID: "e1e42523-8ecb-451e-adc8-fb87212c45ba"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.519063 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1e42523-8ecb-451e-adc8-fb87212c45ba-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.538824 4831 generic.go:334] "Generic (PLEG): container finished" podID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerID="35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3" exitCode=0 Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.538874 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1e42523-8ecb-451e-adc8-fb87212c45ba","Type":"ContainerDied","Data":"35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3"} Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.538912 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.538934 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e1e42523-8ecb-451e-adc8-fb87212c45ba","Type":"ContainerDied","Data":"7eb47b71b8f545a12b2c6fc36e81a73cd1adfa8cb56e4b1566f4a89d2f24d6c0"} Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.538999 4831 scope.go:117] "RemoveContainer" containerID="5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.545536 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-7db8f48c84-p8t9t" event={"ID":"25c0f5bd-0634-4b2e-90e6-b73305b0d259","Type":"ContainerDied","Data":"4096d24682bd7302d931f7b92f01775e586d38038bb12b2fd29a04121ff55d64"} Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.545897 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-7db8f48c84-p8t9t" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.600374 4831 scope.go:117] "RemoveContainer" containerID="35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.614124 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-7db8f48c84-p8t9t"] Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.632847 4831 scope.go:117] "RemoveContainer" containerID="5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f" Dec 03 06:52:39 crc kubenswrapper[4831]: E1203 06:52:39.638304 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f\": container with ID starting with 5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f not found: ID does not exist" 
containerID="5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.638375 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f"} err="failed to get container status \"5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f\": rpc error: code = NotFound desc = could not find container \"5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f\": container with ID starting with 5eec22220ae40fd86b2b181c400ae7357c2281cdd66131de7a9627de56d5844f not found: ID does not exist" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.638407 4831 scope.go:117] "RemoveContainer" containerID="35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3" Dec 03 06:52:39 crc kubenswrapper[4831]: E1203 06:52:39.638653 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3\": container with ID starting with 35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3 not found: ID does not exist" containerID="35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.638673 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3"} err="failed to get container status \"35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3\": rpc error: code = NotFound desc = could not find container \"35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3\": container with ID starting with 35dfa6ea45d1f2ed6c6395aa895492cdd13e56de7e2e1fe334a6243c2a8153a3 not found: ID does not exist" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.638687 4831 scope.go:117] 
"RemoveContainer" containerID="9311b1db5e01c49b0e90298d85e8c6bed68df9a1d8cd2a6b67631d71acfbd32b" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.663542 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-7db8f48c84-p8t9t"] Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.664244 4831 scope.go:117] "RemoveContainer" containerID="90f51f3d66a5492835a792da32b949970ec9443e67e8846c79aa7e8c93ed8ce2" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.695818 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.715360 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.722360 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:39 crc kubenswrapper[4831]: E1203 06:52:39.722842 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerName="probe" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.722861 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerName="probe" Dec 03 06:52:39 crc kubenswrapper[4831]: E1203 06:52:39.722878 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54839c01-76cd-4ab5-89a0-77494bfb730e" containerName="init" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.722885 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="54839c01-76cd-4ab5-89a0-77494bfb730e" containerName="init" Dec 03 06:52:39 crc kubenswrapper[4831]: E1203 06:52:39.722895 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54839c01-76cd-4ab5-89a0-77494bfb730e" containerName="dnsmasq-dns" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.722901 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="54839c01-76cd-4ab5-89a0-77494bfb730e" containerName="dnsmasq-dns" Dec 03 06:52:39 crc kubenswrapper[4831]: E1203 06:52:39.722911 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerName="neutron-httpd" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.722916 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerName="neutron-httpd" Dec 03 06:52:39 crc kubenswrapper[4831]: E1203 06:52:39.722946 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerName="neutron-api" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.722952 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerName="neutron-api" Dec 03 06:52:39 crc kubenswrapper[4831]: E1203 06:52:39.722960 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerName="cinder-scheduler" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.722966 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerName="cinder-scheduler" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.723124 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerName="probe" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.723136 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerName="neutron-api" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.723145 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="54839c01-76cd-4ab5-89a0-77494bfb730e" containerName="dnsmasq-dns" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.723156 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" containerName="cinder-scheduler" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.723173 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" containerName="neutron-httpd" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.724238 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.727825 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.736436 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.830259 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.830297 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa796212-03db-4860-93f4-d2918ed44070-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.830666 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: 
I1203 06:52:39.830746 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.830850 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6687g\" (UniqueName: \"kubernetes.io/projected/fa796212-03db-4860-93f4-d2918ed44070-kube-api-access-6687g\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.830886 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-scripts\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.932201 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.932251 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6687g\" (UniqueName: \"kubernetes.io/projected/fa796212-03db-4860-93f4-d2918ed44070-kube-api-access-6687g\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.932275 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-scripts\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.932351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.932370 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa796212-03db-4860-93f4-d2918ed44070-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.932470 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.932693 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa796212-03db-4860-93f4-d2918ed44070-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.950067 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.950117 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-scripts\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.950428 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.950906 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:39 crc kubenswrapper[4831]: I1203 06:52:39.953422 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6687g\" (UniqueName: \"kubernetes.io/projected/fa796212-03db-4860-93f4-d2918ed44070-kube-api-access-6687g\") pod \"cinder-scheduler-0\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " pod="openstack/cinder-scheduler-0" Dec 03 06:52:40 crc kubenswrapper[4831]: I1203 06:52:40.040801 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 06:52:40 crc kubenswrapper[4831]: I1203 06:52:40.278913 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:40 crc kubenswrapper[4831]: I1203 06:52:40.280015 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:52:40 crc kubenswrapper[4831]: I1203 06:52:40.520467 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:52:40 crc kubenswrapper[4831]: W1203 06:52:40.525219 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa796212_03db_4860_93f4_d2918ed44070.slice/crio-9c34c729095ae2910e230e2b44b9dc0906e76610536e9e1b7bcd5fc60d54ad73 WatchSource:0}: Error finding container 9c34c729095ae2910e230e2b44b9dc0906e76610536e9e1b7bcd5fc60d54ad73: Status 404 returned error can't find the container with id 9c34c729095ae2910e230e2b44b9dc0906e76610536e9e1b7bcd5fc60d54ad73 Dec 03 06:52:40 crc kubenswrapper[4831]: I1203 06:52:40.568160 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fa796212-03db-4860-93f4-d2918ed44070","Type":"ContainerStarted","Data":"9c34c729095ae2910e230e2b44b9dc0906e76610536e9e1b7bcd5fc60d54ad73"} Dec 03 06:52:40 crc kubenswrapper[4831]: I1203 06:52:40.838704 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:52:41 crc kubenswrapper[4831]: I1203 06:52:41.022792 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25c0f5bd-0634-4b2e-90e6-b73305b0d259" path="/var/lib/kubelet/pods/25c0f5bd-0634-4b2e-90e6-b73305b0d259/volumes" Dec 03 06:52:41 crc kubenswrapper[4831]: I1203 06:52:41.023382 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="e1e42523-8ecb-451e-adc8-fb87212c45ba" path="/var/lib/kubelet/pods/e1e42523-8ecb-451e-adc8-fb87212c45ba/volumes" Dec 03 06:52:41 crc kubenswrapper[4831]: I1203 06:52:41.582910 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fa796212-03db-4860-93f4-d2918ed44070","Type":"ContainerStarted","Data":"e1b665f5de4ff9fea0d7f46233aebcb76bd558b21574f6845a1c4a0745851315"} Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.152718 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.154125 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.156416 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.157079 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-hsz64" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.172916 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.235717 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.274229 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.274386 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" 
(UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.274489 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.274667 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfxnz\" (UniqueName: \"kubernetes.io/projected/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-kube-api-access-nfxnz\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.375936 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.376007 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.376057 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nfxnz\" (UniqueName: 
\"kubernetes.io/projected/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-kube-api-access-nfxnz\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.376074 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.376907 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.390143 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config-secret\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.390239 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-combined-ca-bundle\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.425741 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfxnz\" (UniqueName: \"kubernetes.io/projected/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-kube-api-access-nfxnz\") pod \"openstackclient\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " 
pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.483624 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.569371 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.589137 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.606889 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fa796212-03db-4860-93f4-d2918ed44070","Type":"ContainerStarted","Data":"fd1e277c8951121dadc44ba354b735192014953fab00452b13030855e8591cd1"} Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.662790 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.665254 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.725761 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.727393 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.72738315 podStartE2EDuration="3.72738315s" podCreationTimestamp="2025-12-03 06:52:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:42.660015029 +0000 UTC m=+1300.003598537" watchObservedRunningTime="2025-12-03 06:52:42.72738315 +0000 UTC m=+1300.070966658" Dec 03 06:52:42 crc kubenswrapper[4831]: E1203 06:52:42.750401 4831 log.go:32] "RunPodSandbox from runtime service failed" err=< Dec 03 06:52:42 crc kubenswrapper[4831]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e8f5894e-c4fc-43a9-93b8-e8f64e980c6b_0(5129e9248f6437ba550a664401a11bf6bbacd037052f9cecda0c0c30beca9ee7): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5129e9248f6437ba550a664401a11bf6bbacd037052f9cecda0c0c30beca9ee7" Netns:"/var/run/netns/05dcc2e4-7a1e-43b7-b4b1-329680696553" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=5129e9248f6437ba550a664401a11bf6bbacd037052f9cecda0c0c30beca9ee7;K8S_POD_UID=e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b]: expected pod UID "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" but got "4cc3003a-0145-4c6f-bf53-10c7c574874f" from Kube API Dec 03 06:52:42 
crc kubenswrapper[4831]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 06:52:42 crc kubenswrapper[4831]: > Dec 03 06:52:42 crc kubenswrapper[4831]: E1203 06:52:42.750463 4831 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Dec 03 06:52:42 crc kubenswrapper[4831]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_openstackclient_openstack_e8f5894e-c4fc-43a9-93b8-e8f64e980c6b_0(5129e9248f6437ba550a664401a11bf6bbacd037052f9cecda0c0c30beca9ee7): error adding pod openstack_openstackclient to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"5129e9248f6437ba550a664401a11bf6bbacd037052f9cecda0c0c30beca9ee7" Netns:"/var/run/netns/05dcc2e4-7a1e-43b7-b4b1-329680696553" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openstack;K8S_POD_NAME=openstackclient;K8S_POD_INFRA_CONTAINER_ID=5129e9248f6437ba550a664401a11bf6bbacd037052f9cecda0c0c30beca9ee7;K8S_POD_UID=e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" Path:"" ERRORED: error configuring pod [openstack/openstackclient] networking: Multus: [openstack/openstackclient/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b]: expected pod UID "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" but got "4cc3003a-0145-4c6f-bf53-10c7c574874f" from Kube API Dec 03 06:52:42 crc kubenswrapper[4831]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Dec 03 06:52:42 crc kubenswrapper[4831]: > pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.789387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q7wk\" (UniqueName: \"kubernetes.io/projected/4cc3003a-0145-4c6f-bf53-10c7c574874f-kube-api-access-5q7wk\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.789570 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.789633 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config-secret\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.789657 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " 
pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.891658 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.891727 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config-secret\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.891748 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.891874 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5q7wk\" (UniqueName: \"kubernetes.io/projected/4cc3003a-0145-4c6f-bf53-10c7c574874f-kube-api-access-5q7wk\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.892744 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.896822 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-combined-ca-bundle\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.902900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config-secret\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.914876 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q7wk\" (UniqueName: \"kubernetes.io/projected/4cc3003a-0145-4c6f-bf53-10c7c574874f-kube-api-access-5q7wk\") pod \"openstackclient\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " pod="openstack/openstackclient" Dec 03 06:52:42 crc kubenswrapper[4831]: I1203 06:52:42.996221 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.620363 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.640611 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.645926 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" podUID="4cc3003a-0145-4c6f-bf53-10c7c574874f" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.756805 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nfxnz\" (UniqueName: \"kubernetes.io/projected/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-kube-api-access-nfxnz\") pod \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.756886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config\") pod \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.756922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config-secret\") pod \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.757072 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-combined-ca-bundle\") pod \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\" (UID: \"e8f5894e-c4fc-43a9-93b8-e8f64e980c6b\") " Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.759098 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" (UID: "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.763213 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" (UID: "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.763250 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" (UID: "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.763436 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-kube-api-access-nfxnz" (OuterVolumeSpecName: "kube-api-access-nfxnz") pod "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" (UID: "e8f5894e-c4fc-43a9-93b8-e8f64e980c6b"). InnerVolumeSpecName "kube-api-access-nfxnz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.858624 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.858890 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.858954 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.859008 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nfxnz\" (UniqueName: \"kubernetes.io/projected/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b-kube-api-access-nfxnz\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:43 crc kubenswrapper[4831]: I1203 06:52:43.892519 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 06:52:43 crc kubenswrapper[4831]: W1203 06:52:43.908139 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4cc3003a_0145_4c6f_bf53_10c7c574874f.slice/crio-db1d692ee7449e501ebabc324d41070f6004a5380784c9fb39894bc912b223d7 WatchSource:0}: Error finding container db1d692ee7449e501ebabc324d41070f6004a5380784c9fb39894bc912b223d7: Status 404 returned error can't find the container with id db1d692ee7449e501ebabc324d41070f6004a5380784c9fb39894bc912b223d7 Dec 03 06:52:44 crc kubenswrapper[4831]: I1203 06:52:44.630633 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 06:52:44 crc kubenswrapper[4831]: I1203 06:52:44.630659 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4cc3003a-0145-4c6f-bf53-10c7c574874f","Type":"ContainerStarted","Data":"db1d692ee7449e501ebabc324d41070f6004a5380784c9fb39894bc912b223d7"} Dec 03 06:52:44 crc kubenswrapper[4831]: I1203 06:52:44.648595 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" podUID="4cc3003a-0145-4c6f-bf53-10c7c574874f" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.022986 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f5894e-c4fc-43a9-93b8-e8f64e980c6b" path="/var/lib/kubelet/pods/e8f5894e-c4fc-43a9-93b8-e8f64e980c6b/volumes" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.041298 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.774542 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-6d4758c757-wtw5x"] Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.778596 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.791356 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.791676 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.791795 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.828043 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d4758c757-wtw5x"] Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.895431 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-public-tls-certs\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.895486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-combined-ca-bundle\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.895579 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-etc-swift\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: 
I1203 06:52:45.895652 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-run-httpd\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.895865 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-config-data\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.895919 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-log-httpd\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.895935 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-internal-tls-certs\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.895957 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pcbbp\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-kube-api-access-pcbbp\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 
06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.997732 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-log-httpd\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.997791 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-internal-tls-certs\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.997817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pcbbp\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-kube-api-access-pcbbp\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.997907 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-combined-ca-bundle\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.997927 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-public-tls-certs\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 
06:52:45.997958 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-etc-swift\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.997983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-run-httpd\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:45 crc kubenswrapper[4831]: I1203 06:52:45.998069 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-config-data\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.003767 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-log-httpd\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.004790 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-run-httpd\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.012137 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-combined-ca-bundle\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.013867 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-public-tls-certs\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.014666 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-config-data\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.015203 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-internal-tls-certs\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.017245 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-etc-swift\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.037309 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pcbbp\" (UniqueName: 
\"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-kube-api-access-pcbbp\") pod \"swift-proxy-6d4758c757-wtw5x\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.094100 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:46 crc kubenswrapper[4831]: W1203 06:52:46.648124 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5d44a6a4_631f_4ccd_ac61_781623a04c11.slice/crio-0b5fbb7474715b1f9abe3e4ac1638b3adf55f55ae3636addae962a7c63758e8a WatchSource:0}: Error finding container 0b5fbb7474715b1f9abe3e4ac1638b3adf55f55ae3636addae962a7c63758e8a: Status 404 returned error can't find the container with id 0b5fbb7474715b1f9abe3e4ac1638b3adf55f55ae3636addae962a7c63758e8a Dec 03 06:52:46 crc kubenswrapper[4831]: I1203 06:52:46.654207 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-6d4758c757-wtw5x"] Dec 03 06:52:47 crc kubenswrapper[4831]: I1203 06:52:47.721947 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d4758c757-wtw5x" event={"ID":"5d44a6a4-631f-4ccd-ac61-781623a04c11","Type":"ContainerStarted","Data":"bbb0b5757917dd69f07b05eab8c7233245af6d5b34e3733e43bfb640fb96d6e3"} Dec 03 06:52:47 crc kubenswrapper[4831]: I1203 06:52:47.722502 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:47 crc kubenswrapper[4831]: I1203 06:52:47.722516 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d4758c757-wtw5x" event={"ID":"5d44a6a4-631f-4ccd-ac61-781623a04c11","Type":"ContainerStarted","Data":"8b47e6d72bbe33bdcb24b9fec31efaddaf7bd0c2d98f21c24f88fbde38ed48a1"} Dec 03 06:52:47 crc kubenswrapper[4831]: I1203 06:52:47.722528 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:47 crc kubenswrapper[4831]: I1203 06:52:47.722543 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d4758c757-wtw5x" event={"ID":"5d44a6a4-631f-4ccd-ac61-781623a04c11","Type":"ContainerStarted","Data":"0b5fbb7474715b1f9abe3e4ac1638b3adf55f55ae3636addae962a7c63758e8a"} Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.504431 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-6d4758c757-wtw5x" podStartSLOduration=3.504413357 podStartE2EDuration="3.504413357s" podCreationTimestamp="2025-12-03 06:52:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:47.750365469 +0000 UTC m=+1305.093948977" watchObservedRunningTime="2025-12-03 06:52:48.504413357 +0000 UTC m=+1305.847996865" Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.512062 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.512413 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e55d9230-8363-41ae-b723-fc4193432067" containerName="glance-log" containerID="cri-o://eae2681a7008672e1bf1e656ca7a562f6fce2305da8067c939311505d06146cc" gracePeriod=30 Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.512508 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="e55d9230-8363-41ae-b723-fc4193432067" containerName="glance-httpd" containerID="cri-o://c7e9b0a650265fb0d7e98932b0ace54706b2cc22c62a95bf09a219d4be3ee1bb" gracePeriod=30 Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.558424 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/ceilometer-0"] Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.558911 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="ceilometer-central-agent" containerID="cri-o://29aa0c2499b14fae3bcfd8bdd98b9a6c961c0d42b19897f90300f4136cde6e18" gracePeriod=30 Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.559036 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="proxy-httpd" containerID="cri-o://55fb83467af43f754587c9982b500c4a9bc6a85477a4efbace9ee5a6dc39c59b" gracePeriod=30 Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.559069 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="sg-core" containerID="cri-o://9956faa0c6448ec45f1b30ab0bae883e690c8bf4a9c2c996f13f1c42e18b2b77" gracePeriod=30 Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.559097 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="ceilometer-notification-agent" containerID="cri-o://5a11208a0d747fa6881cb7bb233e160fdf096c558cf7a3477399b883f603c094" gracePeriod=30 Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.577932 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="proxy-httpd" probeResult="failure" output="HTTP probe failed with statuscode: 502" Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.738958 4831 generic.go:334] "Generic (PLEG): container finished" podID="561be991-dcaa-43e8-aa44-36bc520372fe" containerID="55fb83467af43f754587c9982b500c4a9bc6a85477a4efbace9ee5a6dc39c59b" exitCode=0 Dec 03 06:52:48 crc 
kubenswrapper[4831]: I1203 06:52:48.738986 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerDied","Data":"55fb83467af43f754587c9982b500c4a9bc6a85477a4efbace9ee5a6dc39c59b"} Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.739028 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerDied","Data":"9956faa0c6448ec45f1b30ab0bae883e690c8bf4a9c2c996f13f1c42e18b2b77"} Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.738999 4831 generic.go:334] "Generic (PLEG): container finished" podID="561be991-dcaa-43e8-aa44-36bc520372fe" containerID="9956faa0c6448ec45f1b30ab0bae883e690c8bf4a9c2c996f13f1c42e18b2b77" exitCode=2 Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.743611 4831 generic.go:334] "Generic (PLEG): container finished" podID="e55d9230-8363-41ae-b723-fc4193432067" containerID="eae2681a7008672e1bf1e656ca7a562f6fce2305da8067c939311505d06146cc" exitCode=143 Dec 03 06:52:48 crc kubenswrapper[4831]: I1203 06:52:48.743687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e55d9230-8363-41ae-b723-fc4193432067","Type":"ContainerDied","Data":"eae2681a7008672e1bf1e656ca7a562f6fce2305da8067c939311505d06146cc"} Dec 03 06:52:49 crc kubenswrapper[4831]: I1203 06:52:49.761757 4831 generic.go:334] "Generic (PLEG): container finished" podID="561be991-dcaa-43e8-aa44-36bc520372fe" containerID="29aa0c2499b14fae3bcfd8bdd98b9a6c961c0d42b19897f90300f4136cde6e18" exitCode=0 Dec 03 06:52:49 crc kubenswrapper[4831]: I1203 06:52:49.761806 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerDied","Data":"29aa0c2499b14fae3bcfd8bdd98b9a6c961c0d42b19897f90300f4136cde6e18"} Dec 03 06:52:50 crc kubenswrapper[4831]: I1203 
06:52:50.089989 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:52:50 crc kubenswrapper[4831]: I1203 06:52:50.091081 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerName="glance-httpd" containerID="cri-o://ac319c6003550dd5b09a7a590a5ff5ccb1bfaf5e3bbacedc96c230bef222f23a" gracePeriod=30 Dec 03 06:52:50 crc kubenswrapper[4831]: I1203 06:52:50.090656 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerName="glance-log" containerID="cri-o://2dfdf1c75dd66cea93f44482b31954d0a8d7559f695a599313d41eadeeb88bdb" gracePeriod=30 Dec 03 06:52:50 crc kubenswrapper[4831]: I1203 06:52:50.450345 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 06:52:50 crc kubenswrapper[4831]: I1203 06:52:50.773736 4831 generic.go:334] "Generic (PLEG): container finished" podID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerID="2dfdf1c75dd66cea93f44482b31954d0a8d7559f695a599313d41eadeeb88bdb" exitCode=143 Dec 03 06:52:50 crc kubenswrapper[4831]: I1203 06:52:50.773785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba03de9b-b96c-4fc0-8534-64e3735dcea4","Type":"ContainerDied","Data":"2dfdf1c75dd66cea93f44482b31954d0a8d7559f695a599313d41eadeeb88bdb"} Dec 03 06:52:51 crc kubenswrapper[4831]: I1203 06:52:51.103789 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:51 crc kubenswrapper[4831]: I1203 06:52:51.784088 4831 generic.go:334] "Generic (PLEG): container finished" podID="e55d9230-8363-41ae-b723-fc4193432067" 
containerID="c7e9b0a650265fb0d7e98932b0ace54706b2cc22c62a95bf09a219d4be3ee1bb" exitCode=0 Dec 03 06:52:51 crc kubenswrapper[4831]: I1203 06:52:51.784233 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e55d9230-8363-41ae-b723-fc4193432067","Type":"ContainerDied","Data":"c7e9b0a650265fb0d7e98932b0ace54706b2cc22c62a95bf09a219d4be3ee1bb"} Dec 03 06:52:52 crc kubenswrapper[4831]: I1203 06:52:52.796474 4831 generic.go:334] "Generic (PLEG): container finished" podID="561be991-dcaa-43e8-aa44-36bc520372fe" containerID="5a11208a0d747fa6881cb7bb233e160fdf096c558cf7a3477399b883f603c094" exitCode=0 Dec 03 06:52:52 crc kubenswrapper[4831]: I1203 06:52:52.796514 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerDied","Data":"5a11208a0d747fa6881cb7bb233e160fdf096c558cf7a3477399b883f603c094"} Dec 03 06:52:53 crc kubenswrapper[4831]: I1203 06:52:53.843601 4831 generic.go:334] "Generic (PLEG): container finished" podID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerID="ac319c6003550dd5b09a7a590a5ff5ccb1bfaf5e3bbacedc96c230bef222f23a" exitCode=0 Dec 03 06:52:53 crc kubenswrapper[4831]: I1203 06:52:53.843896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba03de9b-b96c-4fc0-8534-64e3735dcea4","Type":"ContainerDied","Data":"ac319c6003550dd5b09a7a590a5ff5ccb1bfaf5e3bbacedc96c230bef222f23a"} Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.064335 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.170414 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-logs\") pod \"e55d9230-8363-41ae-b723-fc4193432067\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.170694 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-config-data\") pod \"e55d9230-8363-41ae-b723-fc4193432067\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.170721 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-httpd-run\") pod \"e55d9230-8363-41ae-b723-fc4193432067\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.170740 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"e55d9230-8363-41ae-b723-fc4193432067\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.170766 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-combined-ca-bundle\") pod \"e55d9230-8363-41ae-b723-fc4193432067\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.170799 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fw7lr\" (UniqueName: 
\"kubernetes.io/projected/e55d9230-8363-41ae-b723-fc4193432067-kube-api-access-fw7lr\") pod \"e55d9230-8363-41ae-b723-fc4193432067\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.170832 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-scripts\") pod \"e55d9230-8363-41ae-b723-fc4193432067\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.171007 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-internal-tls-certs\") pod \"e55d9230-8363-41ae-b723-fc4193432067\" (UID: \"e55d9230-8363-41ae-b723-fc4193432067\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.171236 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-logs" (OuterVolumeSpecName: "logs") pod "e55d9230-8363-41ae-b723-fc4193432067" (UID: "e55d9230-8363-41ae-b723-fc4193432067"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.171535 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.173099 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e55d9230-8363-41ae-b723-fc4193432067" (UID: "e55d9230-8363-41ae-b723-fc4193432067"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.175131 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "e55d9230-8363-41ae-b723-fc4193432067" (UID: "e55d9230-8363-41ae-b723-fc4193432067"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.177726 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e55d9230-8363-41ae-b723-fc4193432067-kube-api-access-fw7lr" (OuterVolumeSpecName: "kube-api-access-fw7lr") pod "e55d9230-8363-41ae-b723-fc4193432067" (UID: "e55d9230-8363-41ae-b723-fc4193432067"). InnerVolumeSpecName "kube-api-access-fw7lr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.179304 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-scripts" (OuterVolumeSpecName: "scripts") pod "e55d9230-8363-41ae-b723-fc4193432067" (UID: "e55d9230-8363-41ae-b723-fc4193432067"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.207648 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e55d9230-8363-41ae-b723-fc4193432067" (UID: "e55d9230-8363-41ae-b723-fc4193432067"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.227392 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e55d9230-8363-41ae-b723-fc4193432067" (UID: "e55d9230-8363-41ae-b723-fc4193432067"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.254590 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-config-data" (OuterVolumeSpecName: "config-data") pod "e55d9230-8363-41ae-b723-fc4193432067" (UID: "e55d9230-8363-41ae-b723-fc4193432067"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.259188 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.273788 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.273815 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.273824 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e55d9230-8363-41ae-b723-fc4193432067-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.273871 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.273880 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.273889 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fw7lr\" (UniqueName: \"kubernetes.io/projected/e55d9230-8363-41ae-b723-fc4193432067-kube-api-access-fw7lr\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.273898 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e55d9230-8363-41ae-b723-fc4193432067-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.300686 4831 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.306779 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.374879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-run-httpd\") pod \"561be991-dcaa-43e8-aa44-36bc520372fe\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.374967 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-log-httpd\") pod \"561be991-dcaa-43e8-aa44-36bc520372fe\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-combined-ca-bundle\") pod \"561be991-dcaa-43e8-aa44-36bc520372fe\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375104 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-logs\") pod \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375160 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-scripts\") pod \"561be991-dcaa-43e8-aa44-36bc520372fe\" (UID: 
\"561be991-dcaa-43e8-aa44-36bc520372fe\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375183 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qh5rw\" (UniqueName: \"kubernetes.io/projected/ba03de9b-b96c-4fc0-8534-64e3735dcea4-kube-api-access-qh5rw\") pod \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375253 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7npm\" (UniqueName: \"kubernetes.io/projected/561be991-dcaa-43e8-aa44-36bc520372fe-kube-api-access-t7npm\") pod \"561be991-dcaa-43e8-aa44-36bc520372fe\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375279 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-combined-ca-bundle\") pod \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375349 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-config-data\") pod \"561be991-dcaa-43e8-aa44-36bc520372fe\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375396 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-public-tls-certs\") pod \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375390 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "561be991-dcaa-43e8-aa44-36bc520372fe" (UID: "561be991-dcaa-43e8-aa44-36bc520372fe"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375422 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-httpd-run\") pod \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375449 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375492 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-sg-core-conf-yaml\") pod \"561be991-dcaa-43e8-aa44-36bc520372fe\" (UID: \"561be991-dcaa-43e8-aa44-36bc520372fe\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375537 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-scripts\") pod \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375557 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-config-data\") pod \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\" (UID: \"ba03de9b-b96c-4fc0-8534-64e3735dcea4\") " Dec 03 
06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "561be991-dcaa-43e8-aa44-36bc520372fe" (UID: "561be991-dcaa-43e8-aa44-36bc520372fe"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375981 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.376006 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/561be991-dcaa-43e8-aa44-36bc520372fe-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.376022 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.375990 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-logs" (OuterVolumeSpecName: "logs") pod "ba03de9b-b96c-4fc0-8534-64e3735dcea4" (UID: "ba03de9b-b96c-4fc0-8534-64e3735dcea4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.376617 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "ba03de9b-b96c-4fc0-8534-64e3735dcea4" (UID: "ba03de9b-b96c-4fc0-8534-64e3735dcea4"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.380150 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/561be991-dcaa-43e8-aa44-36bc520372fe-kube-api-access-t7npm" (OuterVolumeSpecName: "kube-api-access-t7npm") pod "561be991-dcaa-43e8-aa44-36bc520372fe" (UID: "561be991-dcaa-43e8-aa44-36bc520372fe"). InnerVolumeSpecName "kube-api-access-t7npm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.380479 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba03de9b-b96c-4fc0-8534-64e3735dcea4-kube-api-access-qh5rw" (OuterVolumeSpecName: "kube-api-access-qh5rw") pod "ba03de9b-b96c-4fc0-8534-64e3735dcea4" (UID: "ba03de9b-b96c-4fc0-8534-64e3735dcea4"). InnerVolumeSpecName "kube-api-access-qh5rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.380852 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-scripts" (OuterVolumeSpecName: "scripts") pod "561be991-dcaa-43e8-aa44-36bc520372fe" (UID: "561be991-dcaa-43e8-aa44-36bc520372fe"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.381743 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-scripts" (OuterVolumeSpecName: "scripts") pod "ba03de9b-b96c-4fc0-8534-64e3735dcea4" (UID: "ba03de9b-b96c-4fc0-8534-64e3735dcea4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.381751 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "ba03de9b-b96c-4fc0-8534-64e3735dcea4" (UID: "ba03de9b-b96c-4fc0-8534-64e3735dcea4"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.407900 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "561be991-dcaa-43e8-aa44-36bc520372fe" (UID: "561be991-dcaa-43e8-aa44-36bc520372fe"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.416697 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba03de9b-b96c-4fc0-8534-64e3735dcea4" (UID: "ba03de9b-b96c-4fc0-8534-64e3735dcea4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.441783 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-config-data" (OuterVolumeSpecName: "config-data") pod "ba03de9b-b96c-4fc0-8534-64e3735dcea4" (UID: "ba03de9b-b96c-4fc0-8534-64e3735dcea4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.444579 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "ba03de9b-b96c-4fc0-8534-64e3735dcea4" (UID: "ba03de9b-b96c-4fc0-8534-64e3735dcea4"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.478496 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7npm\" (UniqueName: \"kubernetes.io/projected/561be991-dcaa-43e8-aa44-36bc520372fe-kube-api-access-t7npm\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.478735 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.478835 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.478903 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.478986 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.479047 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.479104 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.479157 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba03de9b-b96c-4fc0-8534-64e3735dcea4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.479216 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ba03de9b-b96c-4fc0-8534-64e3735dcea4-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.479274 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.479345 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qh5rw\" (UniqueName: \"kubernetes.io/projected/ba03de9b-b96c-4fc0-8534-64e3735dcea4-kube-api-access-qh5rw\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.485484 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "561be991-dcaa-43e8-aa44-36bc520372fe" (UID: "561be991-dcaa-43e8-aa44-36bc520372fe"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.493188 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-config-data" (OuterVolumeSpecName: "config-data") pod "561be991-dcaa-43e8-aa44-36bc520372fe" (UID: "561be991-dcaa-43e8-aa44-36bc520372fe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.497631 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.580465 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.580781 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/561be991-dcaa-43e8-aa44-36bc520372fe-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.580881 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.853239 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"4cc3003a-0145-4c6f-bf53-10c7c574874f","Type":"ContainerStarted","Data":"3aec4649bc0f7597a824ffeb353a8d7f39afe26459a358163be0cbfca4a1b51b"} Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.855844 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"561be991-dcaa-43e8-aa44-36bc520372fe","Type":"ContainerDied","Data":"6d43312896af3441a531a6af22bcf56de3eb731f79367632cd6c8297b81c1d07"} Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.855867 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.855890 4831 scope.go:117] "RemoveContainer" containerID="55fb83467af43f754587c9982b500c4a9bc6a85477a4efbace9ee5a6dc39c59b" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.857375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e55d9230-8363-41ae-b723-fc4193432067","Type":"ContainerDied","Data":"870f2219e2f89ef890aa24f9fc2083e4e392bbfe523a139b2efad054382ed7b5"} Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.857406 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.859472 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"ba03de9b-b96c-4fc0-8534-64e3735dcea4","Type":"ContainerDied","Data":"4f7cb01c1d871723b053658961e9dfdd54591494dcd22ad0d535c96cdcfdae0a"} Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.859502 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.880444 4831 scope.go:117] "RemoveContainer" containerID="9956faa0c6448ec45f1b30ab0bae883e690c8bf4a9c2c996f13f1c42e18b2b77" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.889730 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.010230414 podStartE2EDuration="12.889712196s" podCreationTimestamp="2025-12-03 06:52:42 +0000 UTC" firstStartedPulling="2025-12-03 06:52:43.913467566 +0000 UTC m=+1301.257051074" lastFinishedPulling="2025-12-03 06:52:53.792949338 +0000 UTC m=+1311.136532856" observedRunningTime="2025-12-03 06:52:54.878211507 +0000 UTC m=+1312.221795015" watchObservedRunningTime="2025-12-03 06:52:54.889712196 +0000 UTC m=+1312.233295704" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.905481 4831 scope.go:117] "RemoveContainer" containerID="5a11208a0d747fa6881cb7bb233e160fdf096c558cf7a3477399b883f603c094" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.917033 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.934163 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.954682 4831 scope.go:117] "RemoveContainer" containerID="29aa0c2499b14fae3bcfd8bdd98b9a6c961c0d42b19897f90300f4136cde6e18" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.958279 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.976289 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.986919 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Dec 03 06:52:54 crc kubenswrapper[4831]: E1203 06:52:54.987341 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="ceilometer-central-agent" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987352 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="ceilometer-central-agent" Dec 03 06:52:54 crc kubenswrapper[4831]: E1203 06:52:54.987384 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="ceilometer-notification-agent" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987391 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="ceilometer-notification-agent" Dec 03 06:52:54 crc kubenswrapper[4831]: E1203 06:52:54.987400 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55d9230-8363-41ae-b723-fc4193432067" containerName="glance-log" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987407 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55d9230-8363-41ae-b723-fc4193432067" containerName="glance-log" Dec 03 06:52:54 crc kubenswrapper[4831]: E1203 06:52:54.987422 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerName="glance-log" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987427 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerName="glance-log" Dec 03 06:52:54 crc kubenswrapper[4831]: E1203 06:52:54.987435 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerName="glance-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987440 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerName="glance-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: E1203 06:52:54.987455 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="proxy-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987460 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="proxy-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: E1203 06:52:54.987471 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e55d9230-8363-41ae-b723-fc4193432067" containerName="glance-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987477 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e55d9230-8363-41ae-b723-fc4193432067" containerName="glance-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: E1203 06:52:54.987486 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="sg-core" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987491 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="sg-core" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987643 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="ceilometer-notification-agent" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987656 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="proxy-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987666 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerName="glance-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987679 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="sg-core" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987690 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55d9230-8363-41ae-b723-fc4193432067" containerName="glance-log" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987706 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" containerName="ceilometer-central-agent" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987714 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" containerName="glance-log" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.987722 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e55d9230-8363-41ae-b723-fc4193432067" containerName="glance-httpd" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.988652 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.991085 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-b8246" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.991404 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.991411 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.991721 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.995242 4831 scope.go:117] "RemoveContainer" containerID="c7e9b0a650265fb0d7e98932b0ace54706b2cc22c62a95bf09a219d4be3ee1bb" Dec 03 06:52:54 crc kubenswrapper[4831]: I1203 06:52:54.995474 4831 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.003073 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.010948 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.027032 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="561be991-dcaa-43e8-aa44-36bc520372fe" path="/var/lib/kubelet/pods/561be991-dcaa-43e8-aa44-36bc520372fe/volumes" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.028829 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba03de9b-b96c-4fc0-8534-64e3735dcea4" path="/var/lib/kubelet/pods/ba03de9b-b96c-4fc0-8534-64e3735dcea4/volumes" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.030173 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e55d9230-8363-41ae-b723-fc4193432067" path="/var/lib/kubelet/pods/e55d9230-8363-41ae-b723-fc4193432067/volumes" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.030878 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.034887 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.035034 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.039187 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.042778 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.062477 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.065044 4831 scope.go:117] "RemoveContainer" containerID="eae2681a7008672e1bf1e656ca7a562f6fce2305da8067c939311505d06146cc" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.069854 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.074503 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.075648 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.084389 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.093880 4831 scope.go:117] "RemoveContainer" containerID="ac319c6003550dd5b09a7a590a5ff5ccb1bfaf5e3bbacedc96c230bef222f23a" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.097114 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc 
kubenswrapper[4831]: I1203 06:52:55.097167 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.097220 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.103695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.103759 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.103841 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" 
Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.103876 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104007 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-scripts\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104090 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45tm\" (UniqueName: \"kubernetes.io/projected/7a5d88e3-73a3-4f3d-af31-af675ab452bd-kube-api-access-x45tm\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104146 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-logs\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104168 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-config-data\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 
03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104199 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104226 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104267 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104342 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.104373 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcj8g\" (UniqueName: \"kubernetes.io/projected/267687cf-58da-42e0-852e-c8c87f2ea42a-kube-api-access-fcj8g\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " 
pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.112201 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: E1203 06:52:55.112869 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-strcd log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[combined-ca-bundle config-data kube-api-access-strcd log-httpd run-httpd scripts sg-core-conf-yaml]: context canceled" pod="openstack/ceilometer-0" podUID="7a3edc41-185e-41a1-98dd-54718c5dc66e" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.115729 4831 scope.go:117] "RemoveContainer" containerID="2dfdf1c75dd66cea93f44482b31954d0a8d7559f695a599313d41eadeeb88bdb" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209280 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-run-httpd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209394 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209431 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" 
Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209460 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209496 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-strcd\" (UniqueName: \"kubernetes.io/projected/7a3edc41-185e-41a1-98dd-54718c5dc66e-kube-api-access-strcd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209541 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209573 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209624 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-config-data\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209653 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209680 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209715 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-scripts\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209748 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-log-httpd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x45tm\" (UniqueName: \"kubernetes.io/projected/7a5d88e3-73a3-4f3d-af31-af675ab452bd-kube-api-access-x45tm\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209812 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-scripts\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209829 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-logs\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-config-data\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209862 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209876 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209894 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209917 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209933 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcj8g\" (UniqueName: \"kubernetes.io/projected/267687cf-58da-42e0-852e-c8c87f2ea42a-kube-api-access-fcj8g\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209950 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.209965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.211166 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") device mount path \"/mnt/openstack/pv07\"" 
pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.211924 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.212716 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-logs\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.212819 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.213377 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.213920 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-logs\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.217369 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.220362 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.220637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.222908 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-scripts\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.224644 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-config-data\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.226499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-scripts\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.228598 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-config-data\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.230719 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.237369 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x45tm\" (UniqueName: \"kubernetes.io/projected/7a5d88e3-73a3-4f3d-af31-af675ab452bd-kube-api-access-x45tm\") pod \"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.240331 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcj8g\" (UniqueName: \"kubernetes.io/projected/267687cf-58da-42e0-852e-c8c87f2ea42a-kube-api-access-fcj8g\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.250745 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.252670 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.311225 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.312786 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-log-httpd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.312828 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-scripts\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.312865 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.312880 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-sg-core-conf-yaml\") pod 
\"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.312904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-run-httpd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.312944 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-strcd\" (UniqueName: \"kubernetes.io/projected/7a3edc41-185e-41a1-98dd-54718c5dc66e-kube-api-access-strcd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.312988 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-config-data\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.313833 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-log-httpd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.314109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-run-httpd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.316727 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-scripts\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.320334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.320554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-config-data\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.322332 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.329978 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-strcd\" (UniqueName: \"kubernetes.io/projected/7a3edc41-185e-41a1-98dd-54718c5dc66e-kube-api-access-strcd\") pod \"ceilometer-0\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.358564 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.870841 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.881234 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.924279 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-scripts\") pod \"7a3edc41-185e-41a1-98dd-54718c5dc66e\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.924677 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-config-data\") pod \"7a3edc41-185e-41a1-98dd-54718c5dc66e\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.924741 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-run-httpd\") pod \"7a3edc41-185e-41a1-98dd-54718c5dc66e\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.924819 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-strcd\" (UniqueName: \"kubernetes.io/projected/7a3edc41-185e-41a1-98dd-54718c5dc66e-kube-api-access-strcd\") pod \"7a3edc41-185e-41a1-98dd-54718c5dc66e\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.925007 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-log-httpd\") pod \"7a3edc41-185e-41a1-98dd-54718c5dc66e\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " Dec 03 06:52:55 crc 
kubenswrapper[4831]: I1203 06:52:55.925044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-sg-core-conf-yaml\") pod \"7a3edc41-185e-41a1-98dd-54718c5dc66e\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.925180 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-combined-ca-bundle\") pod \"7a3edc41-185e-41a1-98dd-54718c5dc66e\" (UID: \"7a3edc41-185e-41a1-98dd-54718c5dc66e\") " Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.926801 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7a3edc41-185e-41a1-98dd-54718c5dc66e" (UID: "7a3edc41-185e-41a1-98dd-54718c5dc66e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.926817 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7a3edc41-185e-41a1-98dd-54718c5dc66e" (UID: "7a3edc41-185e-41a1-98dd-54718c5dc66e"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.931782 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.931912 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7a3edc41-185e-41a1-98dd-54718c5dc66e" (UID: "7a3edc41-185e-41a1-98dd-54718c5dc66e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.932170 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-scripts" (OuterVolumeSpecName: "scripts") pod "7a3edc41-185e-41a1-98dd-54718c5dc66e" (UID: "7a3edc41-185e-41a1-98dd-54718c5dc66e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.932223 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-config-data" (OuterVolumeSpecName: "config-data") pod "7a3edc41-185e-41a1-98dd-54718c5dc66e" (UID: "7a3edc41-185e-41a1-98dd-54718c5dc66e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.933433 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a3edc41-185e-41a1-98dd-54718c5dc66e" (UID: "7a3edc41-185e-41a1-98dd-54718c5dc66e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:55 crc kubenswrapper[4831]: W1203 06:52:55.938872 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a5d88e3_73a3_4f3d_af31_af675ab452bd.slice/crio-2dcbb7352b7ae850b9362974b1e839ce31bdf2ecd80c09b95e3c0056154962ad WatchSource:0}: Error finding container 2dcbb7352b7ae850b9362974b1e839ce31bdf2ecd80c09b95e3c0056154962ad: Status 404 returned error can't find the container with id 2dcbb7352b7ae850b9362974b1e839ce31bdf2ecd80c09b95e3c0056154962ad Dec 03 06:52:55 crc kubenswrapper[4831]: I1203 06:52:55.943253 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a3edc41-185e-41a1-98dd-54718c5dc66e-kube-api-access-strcd" (OuterVolumeSpecName: "kube-api-access-strcd") pod "7a3edc41-185e-41a1-98dd-54718c5dc66e" (UID: "7a3edc41-185e-41a1-98dd-54718c5dc66e"). InnerVolumeSpecName "kube-api-access-strcd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.028673 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.028705 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.028714 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.028723 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.028731 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-strcd\" (UniqueName: \"kubernetes.io/projected/7a3edc41-185e-41a1-98dd-54718c5dc66e-kube-api-access-strcd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.028741 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7a3edc41-185e-41a1-98dd-54718c5dc66e-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.028749 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7a3edc41-185e-41a1-98dd-54718c5dc66e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.036765 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.101132 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.880893 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a5d88e3-73a3-4f3d-af31-af675ab452bd","Type":"ContainerStarted","Data":"67de1998f02ce4f499b3387ae3a93d5ab8ad64720e8c404dce1cf6612058988f"} Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.881140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a5d88e3-73a3-4f3d-af31-af675ab452bd","Type":"ContainerStarted","Data":"2dcbb7352b7ae850b9362974b1e839ce31bdf2ecd80c09b95e3c0056154962ad"} Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.883936 4831 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.884442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"267687cf-58da-42e0-852e-c8c87f2ea42a","Type":"ContainerStarted","Data":"a6fcbb703b3c57a0767a6e84e0a057dc745d0f18dbb25eedf9c9e63269b59524"} Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.884503 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"267687cf-58da-42e0-852e-c8c87f2ea42a","Type":"ContainerStarted","Data":"8a05a40ba3b50363e84091dad83d45d1e43ef23cccb85cb2cf5a9c3d010401e0"} Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.934716 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.946518 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.961723 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.969310 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.971359 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.972173 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:52:56 crc kubenswrapper[4831]: I1203 06:52:56.979640 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.031600 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a3edc41-185e-41a1-98dd-54718c5dc66e" path="/var/lib/kubelet/pods/7a3edc41-185e-41a1-98dd-54718c5dc66e/volumes" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.061102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-run-httpd\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.061392 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sm5k9\" (UniqueName: \"kubernetes.io/projected/b6f36b08-51c0-4976-a86a-f3b47ceebde1-kube-api-access-sm5k9\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.061539 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-scripts\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.061630 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.061772 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-log-httpd\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.061919 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-config-data\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.063006 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.099122 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:57 crc kubenswrapper[4831]: E1203 06:52:57.104008 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle config-data kube-api-access-sm5k9 log-httpd run-httpd scripts sg-core-conf-yaml], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/ceilometer-0" podUID="b6f36b08-51c0-4976-a86a-f3b47ceebde1" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.165452 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-config-data\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.165519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.165571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-run-httpd\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.165600 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sm5k9\" (UniqueName: \"kubernetes.io/projected/b6f36b08-51c0-4976-a86a-f3b47ceebde1-kube-api-access-sm5k9\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.165664 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-scripts\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.165691 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.165749 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-log-httpd\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.166482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-log-httpd\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.167637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-run-httpd\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.172280 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.172760 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.177072 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-scripts\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.177249 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-config-data\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.193782 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sm5k9\" (UniqueName: \"kubernetes.io/projected/b6f36b08-51c0-4976-a86a-f3b47ceebde1-kube-api-access-sm5k9\") pod \"ceilometer-0\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.596953 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.597261 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.894442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a5d88e3-73a3-4f3d-af31-af675ab452bd","Type":"ContainerStarted","Data":"74e0644a9792269f43c86550ff06e6fa5782d3aaca7a69696c14e40e26a5beec"} Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.897665 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"267687cf-58da-42e0-852e-c8c87f2ea42a","Type":"ContainerStarted","Data":"134ca55163ecb5e5269d7ec7014e4b029385ad75e92d2133a4bef2bd84b55d9f"} Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.897769 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.908649 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.920204 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.920184629 podStartE2EDuration="3.920184629s" podCreationTimestamp="2025-12-03 06:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:57.916306687 +0000 UTC m=+1315.259890195" watchObservedRunningTime="2025-12-03 06:52:57.920184629 +0000 UTC m=+1315.263768127" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.946824 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.946805503 podStartE2EDuration="3.946805503s" podCreationTimestamp="2025-12-03 06:52:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:52:57.942969063 +0000 UTC m=+1315.286552561" watchObservedRunningTime="2025-12-03 06:52:57.946805503 +0000 UTC m=+1315.290389031" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978090 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-config-data\") pod 
\"b6f36b08-51c0-4976-a86a-f3b47ceebde1\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978148 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sm5k9\" (UniqueName: \"kubernetes.io/projected/b6f36b08-51c0-4976-a86a-f3b47ceebde1-kube-api-access-sm5k9\") pod \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978256 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-sg-core-conf-yaml\") pod \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978284 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-log-httpd\") pod \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978394 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-combined-ca-bundle\") pod \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978419 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-run-httpd\") pod \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978464 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-scripts\") pod \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\" (UID: \"b6f36b08-51c0-4976-a86a-f3b47ceebde1\") " Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978594 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "b6f36b08-51c0-4976-a86a-f3b47ceebde1" (UID: "b6f36b08-51c0-4976-a86a-f3b47ceebde1"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978737 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "b6f36b08-51c0-4976-a86a-f3b47ceebde1" (UID: "b6f36b08-51c0-4976-a86a-f3b47ceebde1"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978888 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.978905 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/b6f36b08-51c0-4976-a86a-f3b47ceebde1-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.982596 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-scripts" (OuterVolumeSpecName: "scripts") pod "b6f36b08-51c0-4976-a86a-f3b47ceebde1" (UID: "b6f36b08-51c0-4976-a86a-f3b47ceebde1"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.982832 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-config-data" (OuterVolumeSpecName: "config-data") pod "b6f36b08-51c0-4976-a86a-f3b47ceebde1" (UID: "b6f36b08-51c0-4976-a86a-f3b47ceebde1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.983092 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6f36b08-51c0-4976-a86a-f3b47ceebde1-kube-api-access-sm5k9" (OuterVolumeSpecName: "kube-api-access-sm5k9") pod "b6f36b08-51c0-4976-a86a-f3b47ceebde1" (UID: "b6f36b08-51c0-4976-a86a-f3b47ceebde1"). InnerVolumeSpecName "kube-api-access-sm5k9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.990235 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b6f36b08-51c0-4976-a86a-f3b47ceebde1" (UID: "b6f36b08-51c0-4976-a86a-f3b47ceebde1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:57 crc kubenswrapper[4831]: I1203 06:52:57.997483 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "b6f36b08-51c0-4976-a86a-f3b47ceebde1" (UID: "b6f36b08-51c0-4976-a86a-f3b47ceebde1"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.080972 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.081042 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.081061 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.081080 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b6f36b08-51c0-4976-a86a-f3b47ceebde1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.081096 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sm5k9\" (UniqueName: \"kubernetes.io/projected/b6f36b08-51c0-4976-a86a-f3b47ceebde1-kube-api-access-sm5k9\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.711227 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.791044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-combined-ca-bundle\") pod \"13682415-6e71-4b15-970c-311c9d163f3d\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.791148 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13682415-6e71-4b15-970c-311c9d163f3d-logs\") pod \"13682415-6e71-4b15-970c-311c9d163f3d\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.791238 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data-custom\") pod \"13682415-6e71-4b15-970c-311c9d163f3d\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.791306 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13682415-6e71-4b15-970c-311c9d163f3d-etc-machine-id\") pod \"13682415-6e71-4b15-970c-311c9d163f3d\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.791362 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-scripts\") pod \"13682415-6e71-4b15-970c-311c9d163f3d\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.791411 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kdh4h\" (UniqueName: 
\"kubernetes.io/projected/13682415-6e71-4b15-970c-311c9d163f3d-kube-api-access-kdh4h\") pod \"13682415-6e71-4b15-970c-311c9d163f3d\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.791448 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data\") pod \"13682415-6e71-4b15-970c-311c9d163f3d\" (UID: \"13682415-6e71-4b15-970c-311c9d163f3d\") " Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.792915 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/13682415-6e71-4b15-970c-311c9d163f3d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "13682415-6e71-4b15-970c-311c9d163f3d" (UID: "13682415-6e71-4b15-970c-311c9d163f3d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.793114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13682415-6e71-4b15-970c-311c9d163f3d-logs" (OuterVolumeSpecName: "logs") pod "13682415-6e71-4b15-970c-311c9d163f3d" (UID: "13682415-6e71-4b15-970c-311c9d163f3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.797080 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "13682415-6e71-4b15-970c-311c9d163f3d" (UID: "13682415-6e71-4b15-970c-311c9d163f3d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.799475 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-scripts" (OuterVolumeSpecName: "scripts") pod "13682415-6e71-4b15-970c-311c9d163f3d" (UID: "13682415-6e71-4b15-970c-311c9d163f3d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.807048 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13682415-6e71-4b15-970c-311c9d163f3d-kube-api-access-kdh4h" (OuterVolumeSpecName: "kube-api-access-kdh4h") pod "13682415-6e71-4b15-970c-311c9d163f3d" (UID: "13682415-6e71-4b15-970c-311c9d163f3d"). InnerVolumeSpecName "kube-api-access-kdh4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.823575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "13682415-6e71-4b15-970c-311c9d163f3d" (UID: "13682415-6e71-4b15-970c-311c9d163f3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.874154 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data" (OuterVolumeSpecName: "config-data") pod "13682415-6e71-4b15-970c-311c9d163f3d" (UID: "13682415-6e71-4b15-970c-311c9d163f3d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.893308 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.893355 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/13682415-6e71-4b15-970c-311c9d163f3d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.893367 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.893379 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kdh4h\" (UniqueName: \"kubernetes.io/projected/13682415-6e71-4b15-970c-311c9d163f3d-kube-api-access-kdh4h\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.893391 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.893401 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/13682415-6e71-4b15-970c-311c9d163f3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.893409 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/13682415-6e71-4b15-970c-311c9d163f3d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.909703 4831 generic.go:334] "Generic 
(PLEG): container finished" podID="13682415-6e71-4b15-970c-311c9d163f3d" containerID="3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7" exitCode=137 Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.909804 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.909998 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13682415-6e71-4b15-970c-311c9d163f3d","Type":"ContainerDied","Data":"3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7"} Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.910071 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"13682415-6e71-4b15-970c-311c9d163f3d","Type":"ContainerDied","Data":"ea811b6b8612da7581d425f4eca7a3ee7e3569912f7ee74575da9cf3ebe66436"} Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.910246 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.910369 4831 scope.go:117] "RemoveContainer" containerID="3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7" Dec 03 06:52:58 crc kubenswrapper[4831]: I1203 06:52:58.949267 4831 scope.go:117] "RemoveContainer" containerID="42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.000241 4831 scope.go:117] "RemoveContainer" containerID="3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7" Dec 03 06:52:59 crc kubenswrapper[4831]: E1203 06:52:59.001263 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7\": container with ID starting with 3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7 not found: ID does not exist" containerID="3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.001391 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7"} err="failed to get container status \"3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7\": rpc error: code = NotFound desc = could not find container \"3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7\": container with ID starting with 3acd0095176c1b2f915f4b35353a5df5914991204a2e41e06ecb4f52ceb13ed7 not found: ID does not exist" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.001412 4831 scope.go:117] "RemoveContainer" containerID="42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4" Dec 03 06:52:59 crc kubenswrapper[4831]: E1203 06:52:59.003833 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could 
not find container \"42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4\": container with ID starting with 42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4 not found: ID does not exist" containerID="42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.003858 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4"} err="failed to get container status \"42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4\": rpc error: code = NotFound desc = could not find container \"42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4\": container with ID starting with 42fce37b570c2e1294e4c9f9bd73d5d6db29cab63f6d1331ab2e259c3b95d6e4 not found: ID does not exist" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.026984 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.027025 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.038569 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:59 crc kubenswrapper[4831]: E1203 06:52:59.038912 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13682415-6e71-4b15-970c-311c9d163f3d" containerName="cinder-api-log" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.038923 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="13682415-6e71-4b15-970c-311c9d163f3d" containerName="cinder-api-log" Dec 03 06:52:59 crc kubenswrapper[4831]: E1203 06:52:59.038948 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13682415-6e71-4b15-970c-311c9d163f3d" containerName="cinder-api" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.038954 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="13682415-6e71-4b15-970c-311c9d163f3d" containerName="cinder-api" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.039120 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="13682415-6e71-4b15-970c-311c9d163f3d" containerName="cinder-api" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.039142 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="13682415-6e71-4b15-970c-311c9d163f3d" containerName="cinder-api-log" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.041101 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.048703 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.048945 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.060476 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.066994 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.074162 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.088956 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.106628 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.108333 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.110306 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.112550 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114350 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114476 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-run-httpd\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114543 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114603 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114751 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114816 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114859 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-scripts\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.114912 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-log-httpd\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.115164 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.115285 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlhps\" (UniqueName: \"kubernetes.io/projected/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-kube-api-access-mlhps\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.115348 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.115375 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-config-data\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.115409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48gpx\" (UniqueName: \"kubernetes.io/projected/6979c180-0a35-47ed-bbeb-fa7a30f11bed-kube-api-access-48gpx\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.115467 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.115522 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.115916 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-logs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217398 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217440 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217465 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-scripts\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217487 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-log-httpd\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217508 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217540 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlhps\" (UniqueName: \"kubernetes.io/projected/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-kube-api-access-mlhps\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217576 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-config-data\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217597 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48gpx\" (UniqueName: \"kubernetes.io/projected/6979c180-0a35-47ed-bbeb-fa7a30f11bed-kube-api-access-48gpx\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217622 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217643 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217659 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-logs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217691 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-run-httpd\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217711 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217728 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217749 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.217840 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-etc-machine-id\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.218828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-logs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.219170 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-log-httpd\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.219625 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-run-httpd\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.222780 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.224424 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-scripts\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.224792 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.225086 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.225376 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.225689 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.226869 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-public-tls-certs\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.226944 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.228212 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-config-data\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.234919 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.237794 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mlhps\" (UniqueName: \"kubernetes.io/projected/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-kube-api-access-mlhps\") pod \"cinder-api-0\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.242423 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48gpx\" (UniqueName: \"kubernetes.io/projected/6979c180-0a35-47ed-bbeb-fa7a30f11bed-kube-api-access-48gpx\") pod \"ceilometer-0\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") " pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.358159 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.428377 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0"
Dec 03 06:52:59 crc kubenswrapper[4831]: W1203 06:52:59.759180 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6979c180_0a35_47ed_bbeb_fa7a30f11bed.slice/crio-e66a3853376abf86841b764358a520456cfdc406ee6d854df3d823f89e6296d4 WatchSource:0}: Error finding container e66a3853376abf86841b764358a520456cfdc406ee6d854df3d823f89e6296d4: Status 404 returned error can't find the container with id e66a3853376abf86841b764358a520456cfdc406ee6d854df3d823f89e6296d4
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.760757 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.873071 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.921012 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerStarted","Data":"e66a3853376abf86841b764358a520456cfdc406ee6d854df3d823f89e6296d4"}
Dec 03 06:52:59 crc kubenswrapper[4831]: I1203 06:52:59.923163 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1","Type":"ContainerStarted","Data":"7d7112e1bf2494fbb6f69eb22091f97a0a8fcd4e8434fc22e1091c126c54bd70"}
Dec 03 06:53:00 crc kubenswrapper[4831]: I1203 06:53:00.937218 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerStarted","Data":"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30"}
Dec 03 06:53:00 crc kubenswrapper[4831]: I1203 06:53:00.939446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1","Type":"ContainerStarted","Data":"db914d2ef3f9530c88bd1d73542d378efb6e0c34e42edbdd03ab88b16f91baa9"}
Dec 03 06:53:01 crc kubenswrapper[4831]: I1203 06:53:01.027670 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13682415-6e71-4b15-970c-311c9d163f3d" path="/var/lib/kubelet/pods/13682415-6e71-4b15-970c-311c9d163f3d/volumes"
Dec 03 06:53:01 crc kubenswrapper[4831]: I1203 06:53:01.028639 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6f36b08-51c0-4976-a86a-f3b47ceebde1" path="/var/lib/kubelet/pods/b6f36b08-51c0-4976-a86a-f3b47ceebde1/volumes"
Dec 03 06:53:01 crc kubenswrapper[4831]: I1203 06:53:01.961528 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerStarted","Data":"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"}
Dec 03 06:53:01 crc kubenswrapper[4831]: I1203 06:53:01.962094 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerStarted","Data":"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"}
Dec 03 06:53:01 crc kubenswrapper[4831]: I1203 06:53:01.965981 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1","Type":"ContainerStarted","Data":"0fb9b727228c96fa187f5a0e156c0ff4cac686daaafe9b93470fdfcc2071ea6f"}
Dec 03 06:53:01 crc kubenswrapper[4831]: I1203 06:53:01.966598 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0"
Dec 03 06:53:01 crc kubenswrapper[4831]: I1203 06:53:01.990183 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=2.990166645 podStartE2EDuration="2.990166645s" podCreationTimestamp="2025-12-03 06:52:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:01.989187195 +0000 UTC m=+1319.332770703" watchObservedRunningTime="2025-12-03 06:53:01.990166645 +0000 UTC m=+1319.333750153"
Dec 03 06:53:03 crc kubenswrapper[4831]: I1203 06:53:03.988376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerStarted","Data":"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"}
Dec 03 06:53:03 crc kubenswrapper[4831]: I1203 06:53:03.989928 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0"
Dec 03 06:53:04 crc kubenswrapper[4831]: I1203 06:53:04.028066 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.824899261 podStartE2EDuration="6.028046493s" podCreationTimestamp="2025-12-03 06:52:58 +0000 UTC" firstStartedPulling="2025-12-03 06:52:59.760618481 +0000 UTC m=+1317.104201989" lastFinishedPulling="2025-12-03 06:53:02.963765713 +0000 UTC m=+1320.307349221" observedRunningTime="2025-12-03 06:53:04.02152363 +0000 UTC m=+1321.365107188" watchObservedRunningTime="2025-12-03 06:53:04.028046493 +0000 UTC m=+1321.371630001"
Dec 03 06:53:05 crc kubenswrapper[4831]: I1203 06:53:05.312412 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 03 06:53:05 crc kubenswrapper[4831]: I1203 06:53:05.312677 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Dec 03 06:53:05 crc kubenswrapper[4831]: I1203 06:53:05.343168 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 03 06:53:05 crc kubenswrapper[4831]: I1203 06:53:05.350777 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Dec 03 06:53:05 crc kubenswrapper[4831]: I1203 06:53:05.358782 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 03 06:53:05 crc kubenswrapper[4831]: I1203 06:53:05.359727 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Dec 03 06:53:05 crc kubenswrapper[4831]: I1203 06:53:05.395613 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 03 06:53:05 crc kubenswrapper[4831]: I1203 06:53:05.408533 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Dec 03 06:53:06 crc kubenswrapper[4831]: I1203 06:53:06.005359 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 03 06:53:06 crc kubenswrapper[4831]: I1203 06:53:06.005406 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 03 06:53:06 crc kubenswrapper[4831]: I1203 06:53:06.005419 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Dec 03 06:53:06 crc kubenswrapper[4831]: I1203 06:53:06.005430 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Dec 03 06:53:06 crc kubenswrapper[4831]: I1203 06:53:06.708406 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.013418 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="sg-core" containerID="cri-o://e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452" gracePeriod=30
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.013441 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="ceilometer-notification-agent" containerID="cri-o://b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d" gracePeriod=30
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.013421 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="proxy-httpd" containerID="cri-o://b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9" gracePeriod=30
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.013489 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="ceilometer-central-agent" containerID="cri-o://16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30" gracePeriod=30
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.856113 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.988217 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-combined-ca-bundle\") pod \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") "
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.988282 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-scripts\") pod \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") "
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.988363 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-config-data\") pod \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") "
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.988432 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-run-httpd\") pod \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") "
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.988466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48gpx\" (UniqueName: \"kubernetes.io/projected/6979c180-0a35-47ed-bbeb-fa7a30f11bed-kube-api-access-48gpx\") pod \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") "
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.988523 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-log-httpd\") pod \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") "
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.988566 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-sg-core-conf-yaml\") pod \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\" (UID: \"6979c180-0a35-47ed-bbeb-fa7a30f11bed\") "
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.989104 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "6979c180-0a35-47ed-bbeb-fa7a30f11bed" (UID: "6979c180-0a35-47ed-bbeb-fa7a30f11bed"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.989378 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "6979c180-0a35-47ed-bbeb-fa7a30f11bed" (UID: "6979c180-0a35-47ed-bbeb-fa7a30f11bed"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 06:53:07 crc kubenswrapper[4831]: I1203 06:53:07.996913 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-scripts" (OuterVolumeSpecName: "scripts") pod "6979c180-0a35-47ed-bbeb-fa7a30f11bed" (UID: "6979c180-0a35-47ed-bbeb-fa7a30f11bed"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.009860 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6979c180-0a35-47ed-bbeb-fa7a30f11bed-kube-api-access-48gpx" (OuterVolumeSpecName: "kube-api-access-48gpx") pod "6979c180-0a35-47ed-bbeb-fa7a30f11bed" (UID: "6979c180-0a35-47ed-bbeb-fa7a30f11bed"). InnerVolumeSpecName "kube-api-access-48gpx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.020800 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "6979c180-0a35-47ed-bbeb-fa7a30f11bed" (UID: "6979c180-0a35-47ed-bbeb-fa7a30f11bed"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.029082 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037223 4831 generic.go:334] "Generic (PLEG): container finished" podID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerID="b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9" exitCode=0
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037265 4831 generic.go:334] "Generic (PLEG): container finished" podID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerID="e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452" exitCode=2
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037274 4831 generic.go:334] "Generic (PLEG): container finished" podID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerID="b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d" exitCode=0
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037281 4831 generic.go:334] "Generic (PLEG): container finished" podID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerID="16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30" exitCode=0
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037297 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037376 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037386 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037420 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerDied","Data":"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"}
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037445 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerDied","Data":"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"}
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037455 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerDied","Data":"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"}
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037465 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerDied","Data":"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30"}
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037473 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6979c180-0a35-47ed-bbeb-fa7a30f11bed","Type":"ContainerDied","Data":"e66a3853376abf86841b764358a520456cfdc406ee6d854df3d823f89e6296d4"}
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037488 4831 scope.go:117] "RemoveContainer" containerID="b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.037614 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.079428 4831 scope.go:117] "RemoveContainer" containerID="e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.085601 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6979c180-0a35-47ed-bbeb-fa7a30f11bed" (UID: "6979c180-0a35-47ed-bbeb-fa7a30f11bed"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.090430 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.090462 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.090476 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.090486 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.090496 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6979c180-0a35-47ed-bbeb-fa7a30f11bed-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.090508 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48gpx\" (UniqueName: \"kubernetes.io/projected/6979c180-0a35-47ed-bbeb-fa7a30f11bed-kube-api-access-48gpx\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.113566 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-config-data" (OuterVolumeSpecName: "config-data") pod "6979c180-0a35-47ed-bbeb-fa7a30f11bed" (UID: "6979c180-0a35-47ed-bbeb-fa7a30f11bed"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.144600 4831 scope.go:117] "RemoveContainer" containerID="b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.205422 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6979c180-0a35-47ed-bbeb-fa7a30f11bed-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.225598 4831 scope.go:117] "RemoveContainer" containerID="16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.283151 4831 scope.go:117] "RemoveContainer" containerID="b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"
Dec 03 06:53:08 crc kubenswrapper[4831]: E1203 06:53:08.285497 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": container with ID starting with b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9 not found: ID does not exist" containerID="b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.285537 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"} err="failed to get container status \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": rpc error: code = NotFound desc = could not find container \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": container with ID starting with b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9 not found: ID does not exist"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.285568 4831 scope.go:117] "RemoveContainer" containerID="e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"
Dec 03 06:53:08 crc kubenswrapper[4831]: E1203 06:53:08.288531 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": container with ID starting with e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452 not found: ID does not exist" containerID="e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.288560 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"} err="failed to get container status \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": rpc error: code = NotFound desc = could not find container \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": container with ID starting with e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452 not found: ID does not exist"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.288583 4831 scope.go:117] "RemoveContainer" containerID="b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"
Dec 03 06:53:08 crc kubenswrapper[4831]: E1203 06:53:08.291722 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": container with ID starting with b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d not found: ID does not exist" containerID="b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"
Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.291744 4831 pod_container_deletor.go:53] "DeleteContainer returned error"
containerID={"Type":"cri-o","ID":"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"} err="failed to get container status \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": rpc error: code = NotFound desc = could not find container \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": container with ID starting with b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.291760 4831 scope.go:117] "RemoveContainer" containerID="16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30" Dec 03 06:53:08 crc kubenswrapper[4831]: E1203 06:53:08.291963 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": container with ID starting with 16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30 not found: ID does not exist" containerID="16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.291983 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30"} err="failed to get container status \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": rpc error: code = NotFound desc = could not find container \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": container with ID starting with 16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.291998 4831 scope.go:117] "RemoveContainer" containerID="b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292175 4831 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"} err="failed to get container status \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": rpc error: code = NotFound desc = could not find container \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": container with ID starting with b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292193 4831 scope.go:117] "RemoveContainer" containerID="e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292378 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"} err="failed to get container status \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": rpc error: code = NotFound desc = could not find container \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": container with ID starting with e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292395 4831 scope.go:117] "RemoveContainer" containerID="b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292564 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"} err="failed to get container status \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": rpc error: code = NotFound desc = could not find container \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": container with ID starting with b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d not 
found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292582 4831 scope.go:117] "RemoveContainer" containerID="16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292746 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30"} err="failed to get container status \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": rpc error: code = NotFound desc = could not find container \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": container with ID starting with 16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292767 4831 scope.go:117] "RemoveContainer" containerID="b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292938 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"} err="failed to get container status \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": rpc error: code = NotFound desc = could not find container \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": container with ID starting with b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.292958 4831 scope.go:117] "RemoveContainer" containerID="e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.293169 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"} err="failed to get 
container status \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": rpc error: code = NotFound desc = could not find container \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": container with ID starting with e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.293191 4831 scope.go:117] "RemoveContainer" containerID="b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.293467 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"} err="failed to get container status \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": rpc error: code = NotFound desc = could not find container \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": container with ID starting with b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.293491 4831 scope.go:117] "RemoveContainer" containerID="16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.293717 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30"} err="failed to get container status \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": rpc error: code = NotFound desc = could not find container \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": container with ID starting with 16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.293738 4831 scope.go:117] "RemoveContainer" 
containerID="b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.293939 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9"} err="failed to get container status \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": rpc error: code = NotFound desc = could not find container \"b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9\": container with ID starting with b4d21ed5c647ed13a853c3f238f0bfd94031e90623a689c339d05d7bb87b73a9 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.293957 4831 scope.go:117] "RemoveContainer" containerID="e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.294149 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452"} err="failed to get container status \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": rpc error: code = NotFound desc = could not find container \"e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452\": container with ID starting with e9fcc0cabe01b8dfc2d5f9c2f6192deb9f761301f45532ce9368c1bdc4baf452 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.294168 4831 scope.go:117] "RemoveContainer" containerID="b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.294371 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d"} err="failed to get container status \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": rpc error: code = NotFound desc = could 
not find container \"b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d\": container with ID starting with b0966fe72fa45b79262f23be76787791692f4d207824e464802b30475597351d not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.294389 4831 scope.go:117] "RemoveContainer" containerID="16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.294591 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30"} err="failed to get container status \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": rpc error: code = NotFound desc = could not find container \"16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30\": container with ID starting with 16a4e050df12d4c788e469c51d8f8662c22bd7085f2c4f8152fec33bdcee8c30 not found: ID does not exist" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.365111 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.397599 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.407082 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427293 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:53:08 crc kubenswrapper[4831]: E1203 06:53:08.427666 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="proxy-httpd" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427686 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" 
containerName="proxy-httpd" Dec 03 06:53:08 crc kubenswrapper[4831]: E1203 06:53:08.427719 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="sg-core" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427726 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="sg-core" Dec 03 06:53:08 crc kubenswrapper[4831]: E1203 06:53:08.427734 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="ceilometer-central-agent" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427741 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="ceilometer-central-agent" Dec 03 06:53:08 crc kubenswrapper[4831]: E1203 06:53:08.427758 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="ceilometer-notification-agent" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427764 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="ceilometer-notification-agent" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427922 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="sg-core" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427938 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="ceilometer-central-agent" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427954 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="proxy-httpd" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.427974 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" containerName="ceilometer-notification-agent" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.431963 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.437330 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.437629 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.458755 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.512986 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.513053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhl6x\" (UniqueName: \"kubernetes.io/projected/e81d6eaf-ea09-43db-906e-dae06ce685d8-kube-api-access-lhl6x\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.513099 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.513150 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.513199 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-scripts\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.513240 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.513267 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-config-data\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.573348 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.615187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.615235 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhl6x\" (UniqueName: \"kubernetes.io/projected/e81d6eaf-ea09-43db-906e-dae06ce685d8-kube-api-access-lhl6x\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.615277 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.615336 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.615384 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-scripts\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.615421 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.615445 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-config-data\") pod \"ceilometer-0\" (UID: 
\"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.616469 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-log-httpd\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.616957 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-run-httpd\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.643178 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhl6x\" (UniqueName: \"kubernetes.io/projected/e81d6eaf-ea09-43db-906e-dae06ce685d8-kube-api-access-lhl6x\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.643758 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.643813 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-scripts\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.643982 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.645629 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-config-data\") pod \"ceilometer-0\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") " pod="openstack/ceilometer-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.710242 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 06:53:08 crc kubenswrapper[4831]: I1203 06:53:08.757299 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:53:09 crc kubenswrapper[4831]: I1203 06:53:09.024337 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6979c180-0a35-47ed-bbeb-fa7a30f11bed" path="/var/lib/kubelet/pods/6979c180-0a35-47ed-bbeb-fa7a30f11bed/volumes" Dec 03 06:53:09 crc kubenswrapper[4831]: I1203 06:53:09.290930 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:53:09 crc kubenswrapper[4831]: I1203 06:53:09.659744 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:53:10 crc kubenswrapper[4831]: I1203 06:53:10.073249 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerStarted","Data":"75b047084dc8b16ee4f7f7d719653de8add48f84de233fd696b42a672c3d2c8e"} Dec 03 06:53:10 crc kubenswrapper[4831]: I1203 06:53:10.073602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerStarted","Data":"3bfb226b992e572755c8dff95b14dfcddcf2ad64aad687e09de9a231b028cfad"} Dec 03 06:53:11 crc kubenswrapper[4831]: I1203 06:53:11.084079 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerStarted","Data":"fde2e50f4d7cfe1b96e8b3f523c0e29c649af725d445a3690039bc74b475203c"} Dec 03 06:53:12 crc kubenswrapper[4831]: I1203 06:53:12.039126 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 06:53:12 crc kubenswrapper[4831]: I1203 06:53:12.984633 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-q4vmd"] Dec 03 06:53:12 crc kubenswrapper[4831]: I1203 06:53:12.985978 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:12 crc kubenswrapper[4831]: I1203 06:53:12.998166 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q4vmd"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.095575 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-w5hvd"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.098510 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w5hvd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.109367 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8edc-account-create-update-fnzhp"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.110636 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.112344 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa364658-00c2-41ba-bb0d-eaae5161de19-operator-scripts\") pod \"nova-api-db-create-q4vmd\" (UID: \"fa364658-00c2-41ba-bb0d-eaae5161de19\") " pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.112426 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn4mc\" (UniqueName: \"kubernetes.io/projected/fa364658-00c2-41ba-bb0d-eaae5161de19-kube-api-access-tn4mc\") pod \"nova-api-db-create-q4vmd\" (UID: \"fa364658-00c2-41ba-bb0d-eaae5161de19\") " pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.112503 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.129942 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-w5hvd"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.131357 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerStarted","Data":"81b5150fbd0b1172d6d28f39ba926d2a465c6b9bc8c2478685908f4bccff980a"} Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.137144 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8edc-account-create-update-fnzhp"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.215271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa364658-00c2-41ba-bb0d-eaae5161de19-operator-scripts\") pod \"nova-api-db-create-q4vmd\" 
(UID: \"fa364658-00c2-41ba-bb0d-eaae5161de19\") " pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.215414 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4cd3e86-d46d-47f3-8477-4dfff4134915-operator-scripts\") pod \"nova-api-8edc-account-create-update-fnzhp\" (UID: \"e4cd3e86-d46d-47f3-8477-4dfff4134915\") " pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.215465 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn4mc\" (UniqueName: \"kubernetes.io/projected/fa364658-00c2-41ba-bb0d-eaae5161de19-kube-api-access-tn4mc\") pod \"nova-api-db-create-q4vmd\" (UID: \"fa364658-00c2-41ba-bb0d-eaae5161de19\") " pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.215534 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q8br4\" (UniqueName: \"kubernetes.io/projected/e4cd3e86-d46d-47f3-8477-4dfff4134915-kube-api-access-q8br4\") pod \"nova-api-8edc-account-create-update-fnzhp\" (UID: \"e4cd3e86-d46d-47f3-8477-4dfff4134915\") " pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.215572 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5k4l6\" (UniqueName: \"kubernetes.io/projected/b2c30990-3c55-4441-8325-1132b7411671-kube-api-access-5k4l6\") pod \"nova-cell0-db-create-w5hvd\" (UID: \"b2c30990-3c55-4441-8325-1132b7411671\") " pod="openstack/nova-cell0-db-create-w5hvd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.215599 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b2c30990-3c55-4441-8325-1132b7411671-operator-scripts\") pod \"nova-cell0-db-create-w5hvd\" (UID: \"b2c30990-3c55-4441-8325-1132b7411671\") " pod="openstack/nova-cell0-db-create-w5hvd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.217681 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa364658-00c2-41ba-bb0d-eaae5161de19-operator-scripts\") pod \"nova-api-db-create-q4vmd\" (UID: \"fa364658-00c2-41ba-bb0d-eaae5161de19\") " pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.234190 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn4mc\" (UniqueName: \"kubernetes.io/projected/fa364658-00c2-41ba-bb0d-eaae5161de19-kube-api-access-tn4mc\") pod \"nova-api-db-create-q4vmd\" (UID: \"fa364658-00c2-41ba-bb0d-eaae5161de19\") " pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.300731 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.300858 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-7556-account-create-update-h24bn"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.303398 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.309484 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.317372 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q8br4\" (UniqueName: \"kubernetes.io/projected/e4cd3e86-d46d-47f3-8477-4dfff4134915-kube-api-access-q8br4\") pod \"nova-api-8edc-account-create-update-fnzhp\" (UID: \"e4cd3e86-d46d-47f3-8477-4dfff4134915\") " pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.317440 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5k4l6\" (UniqueName: \"kubernetes.io/projected/b2c30990-3c55-4441-8325-1132b7411671-kube-api-access-5k4l6\") pod \"nova-cell0-db-create-w5hvd\" (UID: \"b2c30990-3c55-4441-8325-1132b7411671\") " pod="openstack/nova-cell0-db-create-w5hvd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.317478 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c30990-3c55-4441-8325-1132b7411671-operator-scripts\") pod \"nova-cell0-db-create-w5hvd\" (UID: \"b2c30990-3c55-4441-8325-1132b7411671\") " pod="openstack/nova-cell0-db-create-w5hvd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.317656 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4cd3e86-d46d-47f3-8477-4dfff4134915-operator-scripts\") pod \"nova-api-8edc-account-create-update-fnzhp\" (UID: \"e4cd3e86-d46d-47f3-8477-4dfff4134915\") " pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.318456 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4cd3e86-d46d-47f3-8477-4dfff4134915-operator-scripts\") pod \"nova-api-8edc-account-create-update-fnzhp\" (UID: \"e4cd3e86-d46d-47f3-8477-4dfff4134915\") " pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.320815 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c30990-3c55-4441-8325-1132b7411671-operator-scripts\") pod \"nova-cell0-db-create-w5hvd\" (UID: \"b2c30990-3c55-4441-8325-1132b7411671\") " pod="openstack/nova-cell0-db-create-w5hvd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.322950 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7556-account-create-update-h24bn"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.346482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5k4l6\" (UniqueName: \"kubernetes.io/projected/b2c30990-3c55-4441-8325-1132b7411671-kube-api-access-5k4l6\") pod \"nova-cell0-db-create-w5hvd\" (UID: \"b2c30990-3c55-4441-8325-1132b7411671\") " pod="openstack/nova-cell0-db-create-w5hvd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.352737 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q8br4\" (UniqueName: \"kubernetes.io/projected/e4cd3e86-d46d-47f3-8477-4dfff4134915-kube-api-access-q8br4\") pod \"nova-api-8edc-account-create-update-fnzhp\" (UID: \"e4cd3e86-d46d-47f3-8477-4dfff4134915\") " pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.401362 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-d6pll"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.403141 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.416883 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w5hvd" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.419335 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e57717-9dd6-441b-b3bc-563ce1951f14-operator-scripts\") pod \"nova-cell0-7556-account-create-update-h24bn\" (UID: \"29e57717-9dd6-441b-b3bc-563ce1951f14\") " pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.419448 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwhct\" (UniqueName: \"kubernetes.io/projected/29e57717-9dd6-441b-b3bc-563ce1951f14-kube-api-access-gwhct\") pod \"nova-cell0-7556-account-create-update-h24bn\" (UID: \"29e57717-9dd6-441b-b3bc-563ce1951f14\") " pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.434033 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d6pll"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.448364 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.518407 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-dc42-account-create-update-w5llk"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.519889 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.522536 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khggr\" (UniqueName: \"kubernetes.io/projected/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-kube-api-access-khggr\") pod \"nova-cell1-db-create-d6pll\" (UID: \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\") " pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.522597 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwhct\" (UniqueName: \"kubernetes.io/projected/29e57717-9dd6-441b-b3bc-563ce1951f14-kube-api-access-gwhct\") pod \"nova-cell0-7556-account-create-update-h24bn\" (UID: \"29e57717-9dd6-441b-b3bc-563ce1951f14\") " pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.522647 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-operator-scripts\") pod \"nova-cell1-db-create-d6pll\" (UID: \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\") " pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.522702 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e57717-9dd6-441b-b3bc-563ce1951f14-operator-scripts\") pod \"nova-cell0-7556-account-create-update-h24bn\" (UID: \"29e57717-9dd6-441b-b3bc-563ce1951f14\") " pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.522984 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-dc42-account-create-update-w5llk"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 
06:53:13.523392 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e57717-9dd6-441b-b3bc-563ce1951f14-operator-scripts\") pod \"nova-cell0-7556-account-create-update-h24bn\" (UID: \"29e57717-9dd6-441b-b3bc-563ce1951f14\") " pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.528459 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.571253 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwhct\" (UniqueName: \"kubernetes.io/projected/29e57717-9dd6-441b-b3bc-563ce1951f14-kube-api-access-gwhct\") pod \"nova-cell0-7556-account-create-update-h24bn\" (UID: \"29e57717-9dd6-441b-b3bc-563ce1951f14\") " pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.625068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbtgg\" (UniqueName: \"kubernetes.io/projected/0678352c-a29a-4a58-908c-7066bbfe5825-kube-api-access-pbtgg\") pod \"nova-cell1-dc42-account-create-update-w5llk\" (UID: \"0678352c-a29a-4a58-908c-7066bbfe5825\") " pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.625166 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0678352c-a29a-4a58-908c-7066bbfe5825-operator-scripts\") pod \"nova-cell1-dc42-account-create-update-w5llk\" (UID: \"0678352c-a29a-4a58-908c-7066bbfe5825\") " pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.625222 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-khggr\" (UniqueName: \"kubernetes.io/projected/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-kube-api-access-khggr\") pod \"nova-cell1-db-create-d6pll\" (UID: \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\") " pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.625310 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-operator-scripts\") pod \"nova-cell1-db-create-d6pll\" (UID: \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\") " pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.626054 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-operator-scripts\") pod \"nova-cell1-db-create-d6pll\" (UID: \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\") " pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.645535 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khggr\" (UniqueName: \"kubernetes.io/projected/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-kube-api-access-khggr\") pod \"nova-cell1-db-create-d6pll\" (UID: \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\") " pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.684966 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.688404 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-q4vmd"] Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.726945 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbtgg\" (UniqueName: \"kubernetes.io/projected/0678352c-a29a-4a58-908c-7066bbfe5825-kube-api-access-pbtgg\") pod \"nova-cell1-dc42-account-create-update-w5llk\" (UID: \"0678352c-a29a-4a58-908c-7066bbfe5825\") " pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.727003 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0678352c-a29a-4a58-908c-7066bbfe5825-operator-scripts\") pod \"nova-cell1-dc42-account-create-update-w5llk\" (UID: \"0678352c-a29a-4a58-908c-7066bbfe5825\") " pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.727685 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0678352c-a29a-4a58-908c-7066bbfe5825-operator-scripts\") pod \"nova-cell1-dc42-account-create-update-w5llk\" (UID: \"0678352c-a29a-4a58-908c-7066bbfe5825\") " pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.730580 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.744632 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbtgg\" (UniqueName: \"kubernetes.io/projected/0678352c-a29a-4a58-908c-7066bbfe5825-kube-api-access-pbtgg\") pod \"nova-cell1-dc42-account-create-update-w5llk\" (UID: \"0678352c-a29a-4a58-908c-7066bbfe5825\") " pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:13 crc kubenswrapper[4831]: I1203 06:53:13.862785 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:14 crc kubenswrapper[4831]: I1203 06:53:14.034535 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-w5hvd"] Dec 03 06:53:14 crc kubenswrapper[4831]: I1203 06:53:14.061525 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8edc-account-create-update-fnzhp"] Dec 03 06:53:14 crc kubenswrapper[4831]: I1203 06:53:14.166196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-w5hvd" event={"ID":"b2c30990-3c55-4441-8325-1132b7411671","Type":"ContainerStarted","Data":"073ef8a773638e2017d6cd910f10c17067bc8ade36bb29890048d8c25438b09f"} Dec 03 06:53:14 crc kubenswrapper[4831]: I1203 06:53:14.196989 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-d6pll"] Dec 03 06:53:14 crc kubenswrapper[4831]: I1203 06:53:14.197889 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q4vmd" event={"ID":"fa364658-00c2-41ba-bb0d-eaae5161de19","Type":"ContainerStarted","Data":"9e3af65913d913726511c961e8acbcbe13473dc1b95897d24ebe8eaa2cf37615"} Dec 03 06:53:14 crc kubenswrapper[4831]: I1203 06:53:14.213572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8edc-account-create-update-fnzhp" 
event={"ID":"e4cd3e86-d46d-47f3-8477-4dfff4134915","Type":"ContainerStarted","Data":"65702994dfbdb1a6e4588b447dfaf6706eb3c727d6055ef2106d9d6759f50ec2"} Dec 03 06:53:14 crc kubenswrapper[4831]: I1203 06:53:14.217504 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-7556-account-create-update-h24bn"] Dec 03 06:53:14 crc kubenswrapper[4831]: W1203 06:53:14.283214 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod29e57717_9dd6_441b_b3bc_563ce1951f14.slice/crio-74420c11ddfbc9e9273388eb71ede849a27180d3173e31853300e34fff96ff89 WatchSource:0}: Error finding container 74420c11ddfbc9e9273388eb71ede849a27180d3173e31853300e34fff96ff89: Status 404 returned error can't find the container with id 74420c11ddfbc9e9273388eb71ede849a27180d3173e31853300e34fff96ff89 Dec 03 06:53:14 crc kubenswrapper[4831]: I1203 06:53:14.572542 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-dc42-account-create-update-w5llk"] Dec 03 06:53:14 crc kubenswrapper[4831]: E1203 06:53:14.739598 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa364658_00c2_41ba_bb0d_eaae5161de19.slice/crio-54f683321e8d2d8116c08f5d3cfe3537791340a4d01427fbdbb99a494adf2145.scope\": RecentStats: unable to find data in memory cache]" Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.228865 4831 generic.go:334] "Generic (PLEG): container finished" podID="e4cd3e86-d46d-47f3-8477-4dfff4134915" containerID="f43ab3aa4c0eaaff226644bb2adc947099b427f65d25d4d5d58debc73fcb00c2" exitCode=0 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.228971 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8edc-account-create-update-fnzhp" 
event={"ID":"e4cd3e86-d46d-47f3-8477-4dfff4134915","Type":"ContainerDied","Data":"f43ab3aa4c0eaaff226644bb2adc947099b427f65d25d4d5d58debc73fcb00c2"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.231558 4831 generic.go:334] "Generic (PLEG): container finished" podID="0678352c-a29a-4a58-908c-7066bbfe5825" containerID="559f2b757d8e6a0c71983c6a77dee78c697efd373adaa39024dff05879505861" exitCode=0 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.231653 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-dc42-account-create-update-w5llk" event={"ID":"0678352c-a29a-4a58-908c-7066bbfe5825","Type":"ContainerDied","Data":"559f2b757d8e6a0c71983c6a77dee78c697efd373adaa39024dff05879505861"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.231735 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-dc42-account-create-update-w5llk" event={"ID":"0678352c-a29a-4a58-908c-7066bbfe5825","Type":"ContainerStarted","Data":"179a1bcb73528d947d09ea403116a5cf0c8ba773d1c702b6750bd035245b0e9f"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.233712 4831 generic.go:334] "Generic (PLEG): container finished" podID="b2c30990-3c55-4441-8325-1132b7411671" containerID="291b131153029f387717c7da3fd356090449748a8a6f50e5e4ea5f9663532572" exitCode=0 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.233760 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-w5hvd" event={"ID":"b2c30990-3c55-4441-8325-1132b7411671","Type":"ContainerDied","Data":"291b131153029f387717c7da3fd356090449748a8a6f50e5e4ea5f9663532572"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.235730 4831 generic.go:334] "Generic (PLEG): container finished" podID="29e57717-9dd6-441b-b3bc-563ce1951f14" containerID="ab65d6c3a454bed3af0834fe18511469a011150d61395258c6f5699f0ddd1b99" exitCode=0 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.235810 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell0-7556-account-create-update-h24bn" event={"ID":"29e57717-9dd6-441b-b3bc-563ce1951f14","Type":"ContainerDied","Data":"ab65d6c3a454bed3af0834fe18511469a011150d61395258c6f5699f0ddd1b99"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.235840 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7556-account-create-update-h24bn" event={"ID":"29e57717-9dd6-441b-b3bc-563ce1951f14","Type":"ContainerStarted","Data":"74420c11ddfbc9e9273388eb71ede849a27180d3173e31853300e34fff96ff89"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.242451 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerStarted","Data":"cc59704ac175aa51eef90665f0c142d600f23e32957bdb845b6911299943cd82"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.242846 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="ceilometer-central-agent" containerID="cri-o://75b047084dc8b16ee4f7f7d719653de8add48f84de233fd696b42a672c3d2c8e" gracePeriod=30 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.243233 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.243265 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="sg-core" containerID="cri-o://81b5150fbd0b1172d6d28f39ba926d2a465c6b9bc8c2478685908f4bccff980a" gracePeriod=30 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.243341 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="ceilometer-notification-agent" 
containerID="cri-o://fde2e50f4d7cfe1b96e8b3f523c0e29c649af725d445a3690039bc74b475203c" gracePeriod=30 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.243414 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="proxy-httpd" containerID="cri-o://cc59704ac175aa51eef90665f0c142d600f23e32957bdb845b6911299943cd82" gracePeriod=30 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.248167 4831 generic.go:334] "Generic (PLEG): container finished" podID="a70b57c4-0e81-4c6f-95db-7e4dfb9231df" containerID="cb55894e2c9cf5e0ad069474f6818fd633cf016e9cf5e6f36f1acf0b93692be8" exitCode=0 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.248271 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6pll" event={"ID":"a70b57c4-0e81-4c6f-95db-7e4dfb9231df","Type":"ContainerDied","Data":"cb55894e2c9cf5e0ad069474f6818fd633cf016e9cf5e6f36f1acf0b93692be8"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.248305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6pll" event={"ID":"a70b57c4-0e81-4c6f-95db-7e4dfb9231df","Type":"ContainerStarted","Data":"3f7e7bd7c42d2c57c16f9efa079132154ea113a535b3291f1e6a5a87650ce458"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.260445 4831 generic.go:334] "Generic (PLEG): container finished" podID="fa364658-00c2-41ba-bb0d-eaae5161de19" containerID="54f683321e8d2d8116c08f5d3cfe3537791340a4d01427fbdbb99a494adf2145" exitCode=0 Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.260521 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q4vmd" event={"ID":"fa364658-00c2-41ba-bb0d-eaae5161de19","Type":"ContainerDied","Data":"54f683321e8d2d8116c08f5d3cfe3537791340a4d01427fbdbb99a494adf2145"} Dec 03 06:53:15 crc kubenswrapper[4831]: I1203 06:53:15.333562 4831 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.333312435 podStartE2EDuration="7.333546931s" podCreationTimestamp="2025-12-03 06:53:08 +0000 UTC" firstStartedPulling="2025-12-03 06:53:09.301658277 +0000 UTC m=+1326.645241785" lastFinishedPulling="2025-12-03 06:53:14.301892773 +0000 UTC m=+1331.645476281" observedRunningTime="2025-12-03 06:53:15.329895817 +0000 UTC m=+1332.673479335" watchObservedRunningTime="2025-12-03 06:53:15.333546931 +0000 UTC m=+1332.677130439" Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.298819 4831 generic.go:334] "Generic (PLEG): container finished" podID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerID="cc59704ac175aa51eef90665f0c142d600f23e32957bdb845b6911299943cd82" exitCode=0 Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.299166 4831 generic.go:334] "Generic (PLEG): container finished" podID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerID="81b5150fbd0b1172d6d28f39ba926d2a465c6b9bc8c2478685908f4bccff980a" exitCode=2 Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.299176 4831 generic.go:334] "Generic (PLEG): container finished" podID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerID="fde2e50f4d7cfe1b96e8b3f523c0e29c649af725d445a3690039bc74b475203c" exitCode=0 Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.299121 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerDied","Data":"cc59704ac175aa51eef90665f0c142d600f23e32957bdb845b6911299943cd82"} Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.299396 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerDied","Data":"81b5150fbd0b1172d6d28f39ba926d2a465c6b9bc8c2478685908f4bccff980a"} Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.299409 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerDied","Data":"fde2e50f4d7cfe1b96e8b3f523c0e29c649af725d445a3690039bc74b475203c"} Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.846532 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q4vmd" Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.899042 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn4mc\" (UniqueName: \"kubernetes.io/projected/fa364658-00c2-41ba-bb0d-eaae5161de19-kube-api-access-tn4mc\") pod \"fa364658-00c2-41ba-bb0d-eaae5161de19\" (UID: \"fa364658-00c2-41ba-bb0d-eaae5161de19\") " Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.899190 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa364658-00c2-41ba-bb0d-eaae5161de19-operator-scripts\") pod \"fa364658-00c2-41ba-bb0d-eaae5161de19\" (UID: \"fa364658-00c2-41ba-bb0d-eaae5161de19\") " Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.900570 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa364658-00c2-41ba-bb0d-eaae5161de19-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fa364658-00c2-41ba-bb0d-eaae5161de19" (UID: "fa364658-00c2-41ba-bb0d-eaae5161de19"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:16 crc kubenswrapper[4831]: I1203 06:53:16.910492 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa364658-00c2-41ba-bb0d-eaae5161de19-kube-api-access-tn4mc" (OuterVolumeSpecName: "kube-api-access-tn4mc") pod "fa364658-00c2-41ba-bb0d-eaae5161de19" (UID: "fa364658-00c2-41ba-bb0d-eaae5161de19"). InnerVolumeSpecName "kube-api-access-tn4mc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.001467 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn4mc\" (UniqueName: \"kubernetes.io/projected/fa364658-00c2-41ba-bb0d-eaae5161de19-kube-api-access-tn4mc\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.001504 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fa364658-00c2-41ba-bb0d-eaae5161de19-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.024986 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-dc42-account-create-update-w5llk" Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.031256 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7556-account-create-update-h24bn" Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.036996 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6pll" Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.044183 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8edc-account-create-update-fnzhp" Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.066757 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-w5hvd"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.103908 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e57717-9dd6-441b-b3bc-563ce1951f14-operator-scripts\") pod \"29e57717-9dd6-441b-b3bc-563ce1951f14\" (UID: \"29e57717-9dd6-441b-b3bc-563ce1951f14\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.103974 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q8br4\" (UniqueName: \"kubernetes.io/projected/e4cd3e86-d46d-47f3-8477-4dfff4134915-kube-api-access-q8br4\") pod \"e4cd3e86-d46d-47f3-8477-4dfff4134915\" (UID: \"e4cd3e86-d46d-47f3-8477-4dfff4134915\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.104107 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwhct\" (UniqueName: \"kubernetes.io/projected/29e57717-9dd6-441b-b3bc-563ce1951f14-kube-api-access-gwhct\") pod \"29e57717-9dd6-441b-b3bc-563ce1951f14\" (UID: \"29e57717-9dd6-441b-b3bc-563ce1951f14\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.104145 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khggr\" (UniqueName: \"kubernetes.io/projected/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-kube-api-access-khggr\") pod \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\" (UID: \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.104227 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c30990-3c55-4441-8325-1132b7411671-operator-scripts\") pod \"b2c30990-3c55-4441-8325-1132b7411671\" (UID: \"b2c30990-3c55-4441-8325-1132b7411671\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.104250 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4cd3e86-d46d-47f3-8477-4dfff4134915-operator-scripts\") pod \"e4cd3e86-d46d-47f3-8477-4dfff4134915\" (UID: \"e4cd3e86-d46d-47f3-8477-4dfff4134915\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.104274 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0678352c-a29a-4a58-908c-7066bbfe5825-operator-scripts\") pod \"0678352c-a29a-4a58-908c-7066bbfe5825\" (UID: \"0678352c-a29a-4a58-908c-7066bbfe5825\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.104307 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5k4l6\" (UniqueName: \"kubernetes.io/projected/b2c30990-3c55-4441-8325-1132b7411671-kube-api-access-5k4l6\") pod \"b2c30990-3c55-4441-8325-1132b7411671\" (UID: \"b2c30990-3c55-4441-8325-1132b7411671\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.104360 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbtgg\" (UniqueName: \"kubernetes.io/projected/0678352c-a29a-4a58-908c-7066bbfe5825-kube-api-access-pbtgg\") pod \"0678352c-a29a-4a58-908c-7066bbfe5825\" (UID: \"0678352c-a29a-4a58-908c-7066bbfe5825\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.104400 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-operator-scripts\") pod \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\" (UID: \"a70b57c4-0e81-4c6f-95db-7e4dfb9231df\") "
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.105571 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29e57717-9dd6-441b-b3bc-563ce1951f14-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "29e57717-9dd6-441b-b3bc-563ce1951f14" (UID: "29e57717-9dd6-441b-b3bc-563ce1951f14"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.106242 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a70b57c4-0e81-4c6f-95db-7e4dfb9231df" (UID: "a70b57c4-0e81-4c6f-95db-7e4dfb9231df"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.107371 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0678352c-a29a-4a58-908c-7066bbfe5825-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0678352c-a29a-4a58-908c-7066bbfe5825" (UID: "0678352c-a29a-4a58-908c-7066bbfe5825"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.108206 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2c30990-3c55-4441-8325-1132b7411671-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2c30990-3c55-4441-8325-1132b7411671" (UID: "b2c30990-3c55-4441-8325-1132b7411671"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.109290 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4cd3e86-d46d-47f3-8477-4dfff4134915-kube-api-access-q8br4" (OuterVolumeSpecName: "kube-api-access-q8br4") pod "e4cd3e86-d46d-47f3-8477-4dfff4134915" (UID: "e4cd3e86-d46d-47f3-8477-4dfff4134915"). InnerVolumeSpecName "kube-api-access-q8br4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.110540 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4cd3e86-d46d-47f3-8477-4dfff4134915-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e4cd3e86-d46d-47f3-8477-4dfff4134915" (UID: "e4cd3e86-d46d-47f3-8477-4dfff4134915"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.111663 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29e57717-9dd6-441b-b3bc-563ce1951f14-kube-api-access-gwhct" (OuterVolumeSpecName: "kube-api-access-gwhct") pod "29e57717-9dd6-441b-b3bc-563ce1951f14" (UID: "29e57717-9dd6-441b-b3bc-563ce1951f14"). InnerVolumeSpecName "kube-api-access-gwhct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.111801 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0678352c-a29a-4a58-908c-7066bbfe5825-kube-api-access-pbtgg" (OuterVolumeSpecName: "kube-api-access-pbtgg") pod "0678352c-a29a-4a58-908c-7066bbfe5825" (UID: "0678352c-a29a-4a58-908c-7066bbfe5825"). InnerVolumeSpecName "kube-api-access-pbtgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.112188 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c30990-3c55-4441-8325-1132b7411671-kube-api-access-5k4l6" (OuterVolumeSpecName: "kube-api-access-5k4l6") pod "b2c30990-3c55-4441-8325-1132b7411671" (UID: "b2c30990-3c55-4441-8325-1132b7411671"). InnerVolumeSpecName "kube-api-access-5k4l6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.116525 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-kube-api-access-khggr" (OuterVolumeSpecName: "kube-api-access-khggr") pod "a70b57c4-0e81-4c6f-95db-7e4dfb9231df" (UID: "a70b57c4-0e81-4c6f-95db-7e4dfb9231df"). InnerVolumeSpecName "kube-api-access-khggr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206225 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/29e57717-9dd6-441b-b3bc-563ce1951f14-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206261 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q8br4\" (UniqueName: \"kubernetes.io/projected/e4cd3e86-d46d-47f3-8477-4dfff4134915-kube-api-access-q8br4\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206273 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwhct\" (UniqueName: \"kubernetes.io/projected/29e57717-9dd6-441b-b3bc-563ce1951f14-kube-api-access-gwhct\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206281 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khggr\" (UniqueName: \"kubernetes.io/projected/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-kube-api-access-khggr\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206289 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2c30990-3c55-4441-8325-1132b7411671-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206300 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e4cd3e86-d46d-47f3-8477-4dfff4134915-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206308 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0678352c-a29a-4a58-908c-7066bbfe5825-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206334 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5k4l6\" (UniqueName: \"kubernetes.io/projected/b2c30990-3c55-4441-8325-1132b7411671-kube-api-access-5k4l6\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206343 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbtgg\" (UniqueName: \"kubernetes.io/projected/0678352c-a29a-4a58-908c-7066bbfe5825-kube-api-access-pbtgg\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.206351 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a70b57c4-0e81-4c6f-95db-7e4dfb9231df-operator-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.313710 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-q4vmd"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.313733 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-q4vmd" event={"ID":"fa364658-00c2-41ba-bb0d-eaae5161de19","Type":"ContainerDied","Data":"9e3af65913d913726511c961e8acbcbe13473dc1b95897d24ebe8eaa2cf37615"}
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.314223 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e3af65913d913726511c961e8acbcbe13473dc1b95897d24ebe8eaa2cf37615"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.317766 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8edc-account-create-update-fnzhp"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.317802 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8edc-account-create-update-fnzhp" event={"ID":"e4cd3e86-d46d-47f3-8477-4dfff4134915","Type":"ContainerDied","Data":"65702994dfbdb1a6e4588b447dfaf6706eb3c727d6055ef2106d9d6759f50ec2"}
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.317839 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65702994dfbdb1a6e4588b447dfaf6706eb3c727d6055ef2106d9d6759f50ec2"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.323601 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-dc42-account-create-update-w5llk" event={"ID":"0678352c-a29a-4a58-908c-7066bbfe5825","Type":"ContainerDied","Data":"179a1bcb73528d947d09ea403116a5cf0c8ba773d1c702b6750bd035245b0e9f"}
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.323737 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="179a1bcb73528d947d09ea403116a5cf0c8ba773d1c702b6750bd035245b0e9f"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.323627 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-dc42-account-create-update-w5llk"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.325166 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-w5hvd"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.325156 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-w5hvd" event={"ID":"b2c30990-3c55-4441-8325-1132b7411671","Type":"ContainerDied","Data":"073ef8a773638e2017d6cd910f10c17067bc8ade36bb29890048d8c25438b09f"}
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.326001 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="073ef8a773638e2017d6cd910f10c17067bc8ade36bb29890048d8c25438b09f"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.337810 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-7556-account-create-update-h24bn"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.337960 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-7556-account-create-update-h24bn" event={"ID":"29e57717-9dd6-441b-b3bc-563ce1951f14","Type":"ContainerDied","Data":"74420c11ddfbc9e9273388eb71ede849a27180d3173e31853300e34fff96ff89"}
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.338054 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="74420c11ddfbc9e9273388eb71ede849a27180d3173e31853300e34fff96ff89"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.343190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-d6pll" event={"ID":"a70b57c4-0e81-4c6f-95db-7e4dfb9231df","Type":"ContainerDied","Data":"3f7e7bd7c42d2c57c16f9efa079132154ea113a535b3291f1e6a5a87650ce458"}
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.343298 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3f7e7bd7c42d2c57c16f9efa079132154ea113a535b3291f1e6a5a87650ce458"
Dec 03 06:53:17 crc kubenswrapper[4831]: I1203 06:53:17.343424 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-d6pll"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.675412 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wdqgf"]
Dec 03 06:53:18 crc kubenswrapper[4831]: E1203 06:53:18.676771 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c30990-3c55-4441-8325-1132b7411671" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.676864 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c30990-3c55-4441-8325-1132b7411671" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: E1203 06:53:18.676976 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa364658-00c2-41ba-bb0d-eaae5161de19" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.677080 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa364658-00c2-41ba-bb0d-eaae5161de19" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: E1203 06:53:18.677176 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4cd3e86-d46d-47f3-8477-4dfff4134915" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.677245 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4cd3e86-d46d-47f3-8477-4dfff4134915" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: E1203 06:53:18.677347 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29e57717-9dd6-441b-b3bc-563ce1951f14" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.677438 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="29e57717-9dd6-441b-b3bc-563ce1951f14" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: E1203 06:53:18.677524 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a70b57c4-0e81-4c6f-95db-7e4dfb9231df" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.677603 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a70b57c4-0e81-4c6f-95db-7e4dfb9231df" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: E1203 06:53:18.677678 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0678352c-a29a-4a58-908c-7066bbfe5825" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.677738 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0678352c-a29a-4a58-908c-7066bbfe5825" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.678041 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c30990-3c55-4441-8325-1132b7411671" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.678142 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0678352c-a29a-4a58-908c-7066bbfe5825" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.678231 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="29e57717-9dd6-441b-b3bc-563ce1951f14" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.678311 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a70b57c4-0e81-4c6f-95db-7e4dfb9231df" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.678417 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa364658-00c2-41ba-bb0d-eaae5161de19" containerName="mariadb-database-create"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.678493 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4cd3e86-d46d-47f3-8477-4dfff4134915" containerName="mariadb-account-create-update"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.701301 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.703393 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.703834 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wdqgf"]
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.704533 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.704710 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7rthp"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.737837 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-scripts\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.738081 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.738117 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-config-data\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.738140 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96hzb\" (UniqueName: \"kubernetes.io/projected/ce67ad1c-2870-4109-8fc8-86f35414b1ea-kube-api-access-96hzb\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.839701 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-scripts\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.839761 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.839805 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-config-data\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.839827 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96hzb\" (UniqueName: \"kubernetes.io/projected/ce67ad1c-2870-4109-8fc8-86f35414b1ea-kube-api-access-96hzb\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.848554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.850873 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-scripts\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.851421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-config-data\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:18 crc kubenswrapper[4831]: I1203 06:53:18.864078 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96hzb\" (UniqueName: \"kubernetes.io/projected/ce67ad1c-2870-4109-8fc8-86f35414b1ea-kube-api-access-96hzb\") pod \"nova-cell0-conductor-db-sync-wdqgf\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:19 crc kubenswrapper[4831]: I1203 06:53:19.037910 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wdqgf"
Dec 03 06:53:19 crc kubenswrapper[4831]: I1203 06:53:19.513229 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wdqgf"]
Dec 03 06:53:20 crc kubenswrapper[4831]: I1203 06:53:20.374314 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wdqgf" event={"ID":"ce67ad1c-2870-4109-8fc8-86f35414b1ea","Type":"ContainerStarted","Data":"eaecde01d819f9c42319b239d35bab9d6dcfa2975d5b896dea8ad9ffaa2df8d0"}
Dec 03 06:53:24 crc kubenswrapper[4831]: I1203 06:53:24.414872 4831 generic.go:334] "Generic (PLEG): container finished" podID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerID="75b047084dc8b16ee4f7f7d719653de8add48f84de233fd696b42a672c3d2c8e" exitCode=0
Dec 03 06:53:24 crc kubenswrapper[4831]: I1203 06:53:24.415187 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerDied","Data":"75b047084dc8b16ee4f7f7d719653de8add48f84de233fd696b42a672c3d2c8e"}
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.436784 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"e81d6eaf-ea09-43db-906e-dae06ce685d8","Type":"ContainerDied","Data":"3bfb226b992e572755c8dff95b14dfcddcf2ad64aad687e09de9a231b028cfad"}
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.437126 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bfb226b992e572755c8dff95b14dfcddcf2ad64aad687e09de9a231b028cfad"
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.510655 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.695295 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-combined-ca-bundle\") pod \"e81d6eaf-ea09-43db-906e-dae06ce685d8\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") "
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.695724 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-config-data\") pod \"e81d6eaf-ea09-43db-906e-dae06ce685d8\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") "
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.695760 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-run-httpd\") pod \"e81d6eaf-ea09-43db-906e-dae06ce685d8\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") "
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.695776 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-scripts\") pod \"e81d6eaf-ea09-43db-906e-dae06ce685d8\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") "
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.695823 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhl6x\" (UniqueName: \"kubernetes.io/projected/e81d6eaf-ea09-43db-906e-dae06ce685d8-kube-api-access-lhl6x\") pod \"e81d6eaf-ea09-43db-906e-dae06ce685d8\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") "
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.695865 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-log-httpd\") pod \"e81d6eaf-ea09-43db-906e-dae06ce685d8\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") "
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.695901 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-sg-core-conf-yaml\") pod \"e81d6eaf-ea09-43db-906e-dae06ce685d8\" (UID: \"e81d6eaf-ea09-43db-906e-dae06ce685d8\") "
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.696679 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "e81d6eaf-ea09-43db-906e-dae06ce685d8" (UID: "e81d6eaf-ea09-43db-906e-dae06ce685d8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.697034 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "e81d6eaf-ea09-43db-906e-dae06ce685d8" (UID: "e81d6eaf-ea09-43db-906e-dae06ce685d8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.702518 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-scripts" (OuterVolumeSpecName: "scripts") pod "e81d6eaf-ea09-43db-906e-dae06ce685d8" (UID: "e81d6eaf-ea09-43db-906e-dae06ce685d8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.719168 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e81d6eaf-ea09-43db-906e-dae06ce685d8-kube-api-access-lhl6x" (OuterVolumeSpecName: "kube-api-access-lhl6x") pod "e81d6eaf-ea09-43db-906e-dae06ce685d8" (UID: "e81d6eaf-ea09-43db-906e-dae06ce685d8"). InnerVolumeSpecName "kube-api-access-lhl6x". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.751857 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "e81d6eaf-ea09-43db-906e-dae06ce685d8" (UID: "e81d6eaf-ea09-43db-906e-dae06ce685d8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.798521 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-run-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.798581 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-scripts\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.798598 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhl6x\" (UniqueName: \"kubernetes.io/projected/e81d6eaf-ea09-43db-906e-dae06ce685d8-kube-api-access-lhl6x\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.798611 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/e81d6eaf-ea09-43db-906e-dae06ce685d8-log-httpd\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.798622 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.824584 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-config-data" (OuterVolumeSpecName: "config-data") pod "e81d6eaf-ea09-43db-906e-dae06ce685d8" (UID: "e81d6eaf-ea09-43db-906e-dae06ce685d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.828064 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e81d6eaf-ea09-43db-906e-dae06ce685d8" (UID: "e81d6eaf-ea09-43db-906e-dae06ce685d8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.900723 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:26 crc kubenswrapper[4831]: I1203 06:53:26.900758 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e81d6eaf-ea09-43db-906e-dae06ce685d8-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.446466 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.446460 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wdqgf" event={"ID":"ce67ad1c-2870-4109-8fc8-86f35414b1ea","Type":"ContainerStarted","Data":"06366c6c3c4c924b06643fdb1184e05bdab909f3fb392fa7334a1d7661ab4b77"}
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.477847 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-wdqgf" podStartSLOduration=2.610641744 podStartE2EDuration="9.477827953s" podCreationTimestamp="2025-12-03 06:53:18 +0000 UTC" firstStartedPulling="2025-12-03 06:53:19.51718643 +0000 UTC m=+1336.860769938" lastFinishedPulling="2025-12-03 06:53:26.384372639 +0000 UTC m=+1343.727956147" observedRunningTime="2025-12-03 06:53:27.468938764 +0000 UTC m=+1344.812522272" watchObservedRunningTime="2025-12-03 06:53:27.477827953 +0000 UTC m=+1344.821411461"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.489284 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.497069 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.528887 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"]
Dec 03 06:53:27 crc kubenswrapper[4831]: E1203 06:53:27.529349 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="ceilometer-central-agent"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.529376 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="ceilometer-central-agent"
Dec 03 06:53:27 crc kubenswrapper[4831]: E1203 06:53:27.529403 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="sg-core"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.529412 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="sg-core"
Dec 03 06:53:27 crc kubenswrapper[4831]: E1203 06:53:27.529425 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="proxy-httpd"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.529432 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="proxy-httpd"
Dec 03 06:53:27 crc kubenswrapper[4831]: E1203 06:53:27.529464 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="ceilometer-notification-agent"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.529473 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="ceilometer-notification-agent"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.529700 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="proxy-httpd"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.529732 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="ceilometer-central-agent"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.529748 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="ceilometer-notification-agent"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.529760 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" containerName="sg-core"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.531742 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.534877 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.535086 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.541000 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"]
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.597159 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.597230 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.715065 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0"
Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.715330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-run-httpd\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\")
" pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.715348 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-config-data\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.715379 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-scripts\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.715402 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxv2k\" (UniqueName: \"kubernetes.io/projected/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-kube-api-access-mxv2k\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.715421 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-log-httpd\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.715455 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.817299 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.817916 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-run-httpd\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.818111 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-config-data\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.818338 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-scripts\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.818507 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxv2k\" (UniqueName: \"kubernetes.io/projected/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-kube-api-access-mxv2k\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.818587 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-log-httpd\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " 
pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.818688 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.818466 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-run-httpd\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.818857 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-log-httpd\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.822976 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-scripts\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.827266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.832256 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.834120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-config-data\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.840722 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxv2k\" (UniqueName: \"kubernetes.io/projected/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-kube-api-access-mxv2k\") pod \"ceilometer-0\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " pod="openstack/ceilometer-0" Dec 03 06:53:27 crc kubenswrapper[4831]: I1203 06:53:27.849170 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:53:28 crc kubenswrapper[4831]: I1203 06:53:28.335832 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:53:28 crc kubenswrapper[4831]: I1203 06:53:28.462744 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerStarted","Data":"3ee0fe7037ea2749eadf4dbe56a6691c0ff277aaf807750306b452a35a5e238c"} Dec 03 06:53:29 crc kubenswrapper[4831]: I1203 06:53:29.022479 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e81d6eaf-ea09-43db-906e-dae06ce685d8" path="/var/lib/kubelet/pods/e81d6eaf-ea09-43db-906e-dae06ce685d8/volumes" Dec 03 06:53:29 crc kubenswrapper[4831]: I1203 06:53:29.474162 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerStarted","Data":"2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6"} Dec 03 06:53:30 crc kubenswrapper[4831]: I1203 06:53:30.484499 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerStarted","Data":"cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe"} Dec 03 06:53:30 crc kubenswrapper[4831]: I1203 06:53:30.484950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerStarted","Data":"b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f"} Dec 03 06:53:32 crc kubenswrapper[4831]: I1203 06:53:32.524238 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerStarted","Data":"19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17"} Dec 03 06:53:32 crc kubenswrapper[4831]: I1203 06:53:32.524882 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 06:53:32 crc kubenswrapper[4831]: I1203 06:53:32.572484 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.536991568 podStartE2EDuration="5.572460977s" podCreationTimestamp="2025-12-03 06:53:27 +0000 UTC" firstStartedPulling="2025-12-03 06:53:28.32820259 +0000 UTC m=+1345.671786098" lastFinishedPulling="2025-12-03 06:53:31.363671989 +0000 UTC m=+1348.707255507" observedRunningTime="2025-12-03 06:53:32.559671587 +0000 UTC m=+1349.903255095" watchObservedRunningTime="2025-12-03 06:53:32.572460977 +0000 UTC m=+1349.916044495" Dec 03 06:53:36 crc kubenswrapper[4831]: I1203 06:53:36.588275 4831 generic.go:334] "Generic (PLEG): container finished" podID="ce67ad1c-2870-4109-8fc8-86f35414b1ea" 
containerID="06366c6c3c4c924b06643fdb1184e05bdab909f3fb392fa7334a1d7661ab4b77" exitCode=0 Dec 03 06:53:36 crc kubenswrapper[4831]: I1203 06:53:36.588361 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wdqgf" event={"ID":"ce67ad1c-2870-4109-8fc8-86f35414b1ea","Type":"ContainerDied","Data":"06366c6c3c4c924b06643fdb1184e05bdab909f3fb392fa7334a1d7661ab4b77"} Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.087991 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wdqgf" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.236068 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-config-data\") pod \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.236241 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-combined-ca-bundle\") pod \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.236463 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-scripts\") pod \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\" (UID: \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.236532 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96hzb\" (UniqueName: \"kubernetes.io/projected/ce67ad1c-2870-4109-8fc8-86f35414b1ea-kube-api-access-96hzb\") pod \"ce67ad1c-2870-4109-8fc8-86f35414b1ea\" (UID: 
\"ce67ad1c-2870-4109-8fc8-86f35414b1ea\") " Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.242709 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-scripts" (OuterVolumeSpecName: "scripts") pod "ce67ad1c-2870-4109-8fc8-86f35414b1ea" (UID: "ce67ad1c-2870-4109-8fc8-86f35414b1ea"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.243423 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce67ad1c-2870-4109-8fc8-86f35414b1ea-kube-api-access-96hzb" (OuterVolumeSpecName: "kube-api-access-96hzb") pod "ce67ad1c-2870-4109-8fc8-86f35414b1ea" (UID: "ce67ad1c-2870-4109-8fc8-86f35414b1ea"). InnerVolumeSpecName "kube-api-access-96hzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.262509 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ce67ad1c-2870-4109-8fc8-86f35414b1ea" (UID: "ce67ad1c-2870-4109-8fc8-86f35414b1ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.269368 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-config-data" (OuterVolumeSpecName: "config-data") pod "ce67ad1c-2870-4109-8fc8-86f35414b1ea" (UID: "ce67ad1c-2870-4109-8fc8-86f35414b1ea"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.339520 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.339579 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.339598 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-96hzb\" (UniqueName: \"kubernetes.io/projected/ce67ad1c-2870-4109-8fc8-86f35414b1ea-kube-api-access-96hzb\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.339618 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ce67ad1c-2870-4109-8fc8-86f35414b1ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.612546 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-wdqgf" event={"ID":"ce67ad1c-2870-4109-8fc8-86f35414b1ea","Type":"ContainerDied","Data":"eaecde01d819f9c42319b239d35bab9d6dcfa2975d5b896dea8ad9ffaa2df8d0"} Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.612609 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eaecde01d819f9c42319b239d35bab9d6dcfa2975d5b896dea8ad9ffaa2df8d0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.612666 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-wdqgf" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.737681 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 06:53:38 crc kubenswrapper[4831]: E1203 06:53:38.738343 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce67ad1c-2870-4109-8fc8-86f35414b1ea" containerName="nova-cell0-conductor-db-sync" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.738369 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce67ad1c-2870-4109-8fc8-86f35414b1ea" containerName="nova-cell0-conductor-db-sync" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.738778 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce67ad1c-2870-4109-8fc8-86f35414b1ea" containerName="nova-cell0-conductor-db-sync" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.739988 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.742960 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7rthp" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.744035 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.753512 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.851009 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 
06:53:38.851351 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.851441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2rj4\" (UniqueName: \"kubernetes.io/projected/1a18ae10-7f43-4072-b01c-1564735985be-kube-api-access-d2rj4\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.952754 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.952833 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2rj4\" (UniqueName: \"kubernetes.io/projected/1a18ae10-7f43-4072-b01c-1564735985be-kube-api-access-d2rj4\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.952898 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.959360 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.959784 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:38 crc kubenswrapper[4831]: I1203 06:53:38.975482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2rj4\" (UniqueName: \"kubernetes.io/projected/1a18ae10-7f43-4072-b01c-1564735985be-kube-api-access-d2rj4\") pod \"nova-cell0-conductor-0\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:39 crc kubenswrapper[4831]: I1203 06:53:39.064937 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:39 crc kubenswrapper[4831]: I1203 06:53:39.568085 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 06:53:39 crc kubenswrapper[4831]: I1203 06:53:39.627446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a18ae10-7f43-4072-b01c-1564735985be","Type":"ContainerStarted","Data":"a6116f2662c7629f66fb5e0289f22eab236dae54ed70aba728091ccce2d2fbd4"} Dec 03 06:53:40 crc kubenswrapper[4831]: I1203 06:53:40.643969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a18ae10-7f43-4072-b01c-1564735985be","Type":"ContainerStarted","Data":"3733faf327347b93ef92bc4d96624b9ad7e5d0b2f31cffa41222a7c371a88c3d"} Dec 03 06:53:40 crc kubenswrapper[4831]: I1203 06:53:40.644728 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:40 crc kubenswrapper[4831]: I1203 06:53:40.681115 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.681076608 podStartE2EDuration="2.681076608s" podCreationTimestamp="2025-12-03 06:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:40.674489072 +0000 UTC m=+1358.018072620" watchObservedRunningTime="2025-12-03 06:53:40.681076608 +0000 UTC m=+1358.024660146" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.109794 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.589294 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-vw5wm"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.591580 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.597988 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.598496 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.607898 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vw5wm"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.793883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-scripts\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.793973 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-config-data\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.794043 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.794082 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g2t6r\" (UniqueName: \"kubernetes.io/projected/c09903ad-c95e-4958-904f-11dc2e7e52cb-kube-api-access-g2t6r\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.799770 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.801615 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.806989 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.817272 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.818442 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.820678 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.836129 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.873038 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.895882 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-scripts\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.895956 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsbrb\" (UniqueName: \"kubernetes.io/projected/7521f444-707b-4cda-ab06-827988c985b9-kube-api-access-lsbrb\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.895982 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-config-data\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.896018 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7521f444-707b-4cda-ab06-827988c985b9-logs\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.896036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.896054 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ksbq4\" (UniqueName: \"kubernetes.io/projected/b72aaa56-c315-426d-87e3-7f1996fd1bdf-kube-api-access-ksbq4\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.896088 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-config-data\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.896114 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.896138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g2t6r\" (UniqueName: \"kubernetes.io/projected/c09903ad-c95e-4958-904f-11dc2e7e52cb-kube-api-access-g2t6r\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.896171 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-config-data\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.896208 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.905609 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-scripts\") pod 
\"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.911221 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.911845 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.912764 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.913277 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-config-data\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.915936 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.918774 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.927910 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2t6r\" (UniqueName: \"kubernetes.io/projected/c09903ad-c95e-4958-904f-11dc2e7e52cb-kube-api-access-g2t6r\") pod \"nova-cell0-cell-mapping-vw5wm\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.986962 4831 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:53:44 crc kubenswrapper[4831]: I1203 06:53:44.988652 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.002279 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.002363 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.002397 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsbrb\" (UniqueName: \"kubernetes.io/projected/7521f444-707b-4cda-ab06-827988c985b9-kube-api-access-lsbrb\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.002435 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7521f444-707b-4cda-ab06-827988c985b9-logs\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.002455 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.002475 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-ksbq4\" (UniqueName: \"kubernetes.io/projected/b72aaa56-c315-426d-87e3-7f1996fd1bdf-kube-api-access-ksbq4\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.002510 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-config-data\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.002560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-config-data\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.003616 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7521f444-707b-4cda-ab06-827988c985b9-logs\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.005797 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.006538 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.007103 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-config-data\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.007455 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-config-data\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.008967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.043361 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsbrb\" (UniqueName: \"kubernetes.io/projected/7521f444-707b-4cda-ab06-827988c985b9-kube-api-access-lsbrb\") pod \"nova-api-0\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.045717 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ksbq4\" (UniqueName: \"kubernetes.io/projected/b72aaa56-c315-426d-87e3-7f1996fd1bdf-kube-api-access-ksbq4\") pod \"nova-scheduler-0\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " pod="openstack/nova-scheduler-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.073765 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-97qsj"] Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.075300 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.087322 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-97qsj"] Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.104586 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.104640 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.104682 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82mmx\" (UniqueName: \"kubernetes.io/projected/d12bc766-285d-4de7-afca-be32ff19514f-kube-api-access-82mmx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.104707 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.104730 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/56aff6b8-8e5d-4fa3-b43c-14d58054745f-logs\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.104781 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zdgsq\" (UniqueName: \"kubernetes.io/projected/56aff6b8-8e5d-4fa3-b43c-14d58054745f-kube-api-access-zdgsq\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.104797 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-config-data\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.127714 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.139177 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.206472 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9ztp\" (UniqueName: \"kubernetes.io/projected/d0a762a2-e53e-4c92-9dee-600387fa5444-kube-api-access-h9ztp\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.206775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-config-data\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.206797 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zdgsq\" (UniqueName: \"kubernetes.io/projected/56aff6b8-8e5d-4fa3-b43c-14d58054745f-kube-api-access-zdgsq\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.206851 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.206890 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " 
pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.206919 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.206959 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.207004 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82mmx\" (UniqueName: \"kubernetes.io/projected/d12bc766-285d-4de7-afca-be32ff19514f-kube-api-access-82mmx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.207036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-svc\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.207055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 
06:53:45.207085 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-config\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.207109 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56aff6b8-8e5d-4fa3-b43c-14d58054745f-logs\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.207154 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.209677 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56aff6b8-8e5d-4fa3-b43c-14d58054745f-logs\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.210707 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.210809 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-config-data\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.212725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.214462 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.224627 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.232340 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zdgsq\" (UniqueName: \"kubernetes.io/projected/56aff6b8-8e5d-4fa3-b43c-14d58054745f-kube-api-access-zdgsq\") pod \"nova-metadata-0\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.238081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82mmx\" (UniqueName: \"kubernetes.io/projected/d12bc766-285d-4de7-afca-be32ff19514f-kube-api-access-82mmx\") pod \"nova-cell1-novncproxy-0\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.309022 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.309129 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-svc\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.309150 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-config\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: 
I1203 06:53:45.309184 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.309218 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9ztp\" (UniqueName: \"kubernetes.io/projected/d0a762a2-e53e-4c92-9dee-600387fa5444-kube-api-access-h9ztp\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.309256 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.310139 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-config\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.310346 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.310547 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-svc\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.310547 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.310741 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.343900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9ztp\" (UniqueName: \"kubernetes.io/projected/d0a762a2-e53e-4c92-9dee-600387fa5444-kube-api-access-h9ztp\") pod \"dnsmasq-dns-bccf8f775-97qsj\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.416924 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.428744 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.441020 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.627126 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjh7q"] Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.628226 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.634805 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjh7q"] Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.635740 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.636717 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.649057 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:53:45 crc kubenswrapper[4831]: W1203 06:53:45.656518 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb72aaa56_c315_426d_87e3_7f1996fd1bdf.slice/crio-1ccda456d4dc12c8c6a133df8a6027910ae1afa9da2b47d213aec1dc6484e32a WatchSource:0}: Error finding container 1ccda456d4dc12c8c6a133df8a6027910ae1afa9da2b47d213aec1dc6484e32a: Status 404 returned error can't find the container with id 1ccda456d4dc12c8c6a133df8a6027910ae1afa9da2b47d213aec1dc6484e32a Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.696261 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b72aaa56-c315-426d-87e3-7f1996fd1bdf","Type":"ContainerStarted","Data":"1ccda456d4dc12c8c6a133df8a6027910ae1afa9da2b47d213aec1dc6484e32a"} Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 
06:53:45.726454 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6n9h\" (UniqueName: \"kubernetes.io/projected/1e1232dc-f0a3-4694-ba92-127eaba9fda6-kube-api-access-q6n9h\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.726528 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.726596 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-scripts\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.726673 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-config-data\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.752661 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.828846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-config-data\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.829808 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6n9h\" (UniqueName: \"kubernetes.io/projected/1e1232dc-f0a3-4694-ba92-127eaba9fda6-kube-api-access-q6n9h\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.829867 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.829904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-scripts\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.834067 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-scripts\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.834844 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.835658 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-config-data\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.856864 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6n9h\" (UniqueName: \"kubernetes.io/projected/1e1232dc-f0a3-4694-ba92-127eaba9fda6-kube-api-access-q6n9h\") pod \"nova-cell1-conductor-db-sync-gjh7q\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:45 crc kubenswrapper[4831]: I1203 06:53:45.952551 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:46 crc kubenswrapper[4831]: I1203 06:53:46.800837 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-vw5wm"] Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.313205 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.410503 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjh7q"] Dec 03 06:53:47 crc kubenswrapper[4831]: W1203 06:53:47.412115 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1e1232dc_f0a3_4694_ba92_127eaba9fda6.slice/crio-f09af791e118b4a119e1730fb9d051f1805db9a43152da9f2e529bbe51c03d21 WatchSource:0}: Error finding container f09af791e118b4a119e1730fb9d051f1805db9a43152da9f2e529bbe51c03d21: Status 404 returned error can't find the container with id f09af791e118b4a119e1730fb9d051f1805db9a43152da9f2e529bbe51c03d21 Dec 03 06:53:47 crc kubenswrapper[4831]: W1203 06:53:47.424899 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0a762a2_e53e_4c92_9dee_600387fa5444.slice/crio-c19cc2a3c853fa8a9e426cd70a7d51ae8d585ad36871dea8c7519b659efc8514 WatchSource:0}: Error finding container c19cc2a3c853fa8a9e426cd70a7d51ae8d585ad36871dea8c7519b659efc8514: Status 404 returned error can't find the container with id c19cc2a3c853fa8a9e426cd70a7d51ae8d585ad36871dea8c7519b659efc8514 Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.429987 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-97qsj"] Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.527754 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:47 crc 
kubenswrapper[4831]: I1203 06:53:47.786763 4831 generic.go:334] "Generic (PLEG): container finished" podID="d0a762a2-e53e-4c92-9dee-600387fa5444" containerID="cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3" exitCode=0 Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.787134 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" event={"ID":"d0a762a2-e53e-4c92-9dee-600387fa5444","Type":"ContainerDied","Data":"cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.787165 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" event={"ID":"d0a762a2-e53e-4c92-9dee-600387fa5444","Type":"ContainerStarted","Data":"c19cc2a3c853fa8a9e426cd70a7d51ae8d585ad36871dea8c7519b659efc8514"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.791327 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d12bc766-285d-4de7-afca-be32ff19514f","Type":"ContainerStarted","Data":"436a0cd0f674d25b9d4aaa04bce763fb896ed7051620d55535bc4a497f800817"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.797349 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aff6b8-8e5d-4fa3-b43c-14d58054745f","Type":"ContainerStarted","Data":"fab639ae2820c5dc91dff4c511c5fc6900abbc63c0111306c0829dcd0b5c1261"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.806547 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vw5wm" event={"ID":"c09903ad-c95e-4958-904f-11dc2e7e52cb","Type":"ContainerStarted","Data":"658edd03dc1f0ca24b716ddd82b056b7f1cebc66352ec23318becee9a216d637"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.806591 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vw5wm" 
event={"ID":"c09903ad-c95e-4958-904f-11dc2e7e52cb","Type":"ContainerStarted","Data":"50b68433212e913a7c1f1be441aef8119c4d032c54c9062d27e252142a1967ab"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.814457 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7521f444-707b-4cda-ab06-827988c985b9","Type":"ContainerStarted","Data":"7356ef4704df3f306b3f8e9c79647fe0ecddb6361da77c973adca2de6c919e3b"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.823081 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" event={"ID":"1e1232dc-f0a3-4694-ba92-127eaba9fda6","Type":"ContainerStarted","Data":"cead6ad1cbb0e7e540d790058263a8a503bc9aedb3e982519dedca76eca29655"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.823135 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" event={"ID":"1e1232dc-f0a3-4694-ba92-127eaba9fda6","Type":"ContainerStarted","Data":"f09af791e118b4a119e1730fb9d051f1805db9a43152da9f2e529bbe51c03d21"} Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.834599 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-vw5wm" podStartSLOduration=3.83458095 podStartE2EDuration="3.83458095s" podCreationTimestamp="2025-12-03 06:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:47.824423574 +0000 UTC m=+1365.168007082" watchObservedRunningTime="2025-12-03 06:53:47.83458095 +0000 UTC m=+1365.178164458" Dec 03 06:53:47 crc kubenswrapper[4831]: I1203 06:53:47.855101 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" podStartSLOduration=2.855077134 podStartE2EDuration="2.855077134s" podCreationTimestamp="2025-12-03 06:53:45 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:47.842482825 +0000 UTC m=+1365.186066333" watchObservedRunningTime="2025-12-03 06:53:47.855077134 +0000 UTC m=+1365.198660642" Dec 03 06:53:48 crc kubenswrapper[4831]: I1203 06:53:48.688082 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:48 crc kubenswrapper[4831]: I1203 06:53:48.763422 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:53:50 crc kubenswrapper[4831]: I1203 06:53:50.868196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aff6b8-8e5d-4fa3-b43c-14d58054745f","Type":"ContainerStarted","Data":"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd"} Dec 03 06:53:50 crc kubenswrapper[4831]: I1203 06:53:50.877444 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7521f444-707b-4cda-ab06-827988c985b9","Type":"ContainerStarted","Data":"4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9"} Dec 03 06:53:50 crc kubenswrapper[4831]: I1203 06:53:50.881403 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" event={"ID":"d0a762a2-e53e-4c92-9dee-600387fa5444","Type":"ContainerStarted","Data":"7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac"} Dec 03 06:53:50 crc kubenswrapper[4831]: I1203 06:53:50.882865 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:50 crc kubenswrapper[4831]: I1203 06:53:50.888917 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b72aaa56-c315-426d-87e3-7f1996fd1bdf","Type":"ContainerStarted","Data":"890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8"} Dec 03 06:53:50 crc kubenswrapper[4831]: I1203 
06:53:50.905983 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" podStartSLOduration=6.90596372 podStartE2EDuration="6.90596372s" podCreationTimestamp="2025-12-03 06:53:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:50.900015016 +0000 UTC m=+1368.243598534" watchObservedRunningTime="2025-12-03 06:53:50.90596372 +0000 UTC m=+1368.249547228" Dec 03 06:53:50 crc kubenswrapper[4831]: I1203 06:53:50.925234 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.624249308 podStartE2EDuration="6.925215807s" podCreationTimestamp="2025-12-03 06:53:44 +0000 UTC" firstStartedPulling="2025-12-03 06:53:45.660241055 +0000 UTC m=+1363.003824563" lastFinishedPulling="2025-12-03 06:53:49.961207554 +0000 UTC m=+1367.304791062" observedRunningTime="2025-12-03 06:53:50.923660958 +0000 UTC m=+1368.267244466" watchObservedRunningTime="2025-12-03 06:53:50.925215807 +0000 UTC m=+1368.268799315" Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.903726 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aff6b8-8e5d-4fa3-b43c-14d58054745f","Type":"ContainerStarted","Data":"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3"} Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.903867 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerName="nova-metadata-log" containerID="cri-o://b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd" gracePeriod=30 Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.904007 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" 
podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerName="nova-metadata-metadata" containerID="cri-o://e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3" gracePeriod=30 Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.907166 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7521f444-707b-4cda-ab06-827988c985b9","Type":"ContainerStarted","Data":"3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7"} Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.913173 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d12bc766-285d-4de7-afca-be32ff19514f" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad" gracePeriod=30 Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.913310 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d12bc766-285d-4de7-afca-be32ff19514f","Type":"ContainerStarted","Data":"b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad"} Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.932198 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=5.236504442 podStartE2EDuration="7.932176642s" podCreationTimestamp="2025-12-03 06:53:44 +0000 UTC" firstStartedPulling="2025-12-03 06:53:47.551899374 +0000 UTC m=+1364.895482882" lastFinishedPulling="2025-12-03 06:53:50.247571574 +0000 UTC m=+1367.591155082" observedRunningTime="2025-12-03 06:53:51.925456193 +0000 UTC m=+1369.269039701" watchObservedRunningTime="2025-12-03 06:53:51.932176642 +0000 UTC m=+1369.275760150" Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.951109 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=4.480742435 
podStartE2EDuration="7.951093168s" podCreationTimestamp="2025-12-03 06:53:44 +0000 UTC" firstStartedPulling="2025-12-03 06:53:47.356595927 +0000 UTC m=+1364.700179435" lastFinishedPulling="2025-12-03 06:53:50.82694666 +0000 UTC m=+1368.170530168" observedRunningTime="2025-12-03 06:53:51.94342632 +0000 UTC m=+1369.287009848" watchObservedRunningTime="2025-12-03 06:53:51.951093168 +0000 UTC m=+1369.294676676" Dec 03 06:53:51 crc kubenswrapper[4831]: I1203 06:53:51.968294 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=4.778672055 podStartE2EDuration="7.968276951s" podCreationTimestamp="2025-12-03 06:53:44 +0000 UTC" firstStartedPulling="2025-12-03 06:53:46.772392932 +0000 UTC m=+1364.115976430" lastFinishedPulling="2025-12-03 06:53:49.961997818 +0000 UTC m=+1367.305581326" observedRunningTime="2025-12-03 06:53:51.961965935 +0000 UTC m=+1369.305549443" watchObservedRunningTime="2025-12-03 06:53:51.968276951 +0000 UTC m=+1369.311860459" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.499738 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.506501 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zdgsq\" (UniqueName: \"kubernetes.io/projected/56aff6b8-8e5d-4fa3-b43c-14d58054745f-kube-api-access-zdgsq\") pod \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.506566 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-config-data\") pod \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.513091 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56aff6b8-8e5d-4fa3-b43c-14d58054745f-kube-api-access-zdgsq" (OuterVolumeSpecName: "kube-api-access-zdgsq") pod "56aff6b8-8e5d-4fa3-b43c-14d58054745f" (UID: "56aff6b8-8e5d-4fa3-b43c-14d58054745f"). InnerVolumeSpecName "kube-api-access-zdgsq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.561705 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-config-data" (OuterVolumeSpecName: "config-data") pod "56aff6b8-8e5d-4fa3-b43c-14d58054745f" (UID: "56aff6b8-8e5d-4fa3-b43c-14d58054745f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.608796 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-combined-ca-bundle\") pod \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.609282 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56aff6b8-8e5d-4fa3-b43c-14d58054745f-logs\") pod \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\" (UID: \"56aff6b8-8e5d-4fa3-b43c-14d58054745f\") " Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.609606 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56aff6b8-8e5d-4fa3-b43c-14d58054745f-logs" (OuterVolumeSpecName: "logs") pod "56aff6b8-8e5d-4fa3-b43c-14d58054745f" (UID: "56aff6b8-8e5d-4fa3-b43c-14d58054745f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.610040 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/56aff6b8-8e5d-4fa3-b43c-14d58054745f-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.610073 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zdgsq\" (UniqueName: \"kubernetes.io/projected/56aff6b8-8e5d-4fa3-b43c-14d58054745f-kube-api-access-zdgsq\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.610093 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.645674 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "56aff6b8-8e5d-4fa3-b43c-14d58054745f" (UID: "56aff6b8-8e5d-4fa3-b43c-14d58054745f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.711336 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/56aff6b8-8e5d-4fa3-b43c-14d58054745f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.925369 4831 generic.go:334] "Generic (PLEG): container finished" podID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerID="e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3" exitCode=0 Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.925423 4831 generic.go:334] "Generic (PLEG): container finished" podID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerID="b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd" exitCode=143 Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.928107 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.932751 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aff6b8-8e5d-4fa3-b43c-14d58054745f","Type":"ContainerDied","Data":"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3"} Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.932922 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aff6b8-8e5d-4fa3-b43c-14d58054745f","Type":"ContainerDied","Data":"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd"} Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.932942 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"56aff6b8-8e5d-4fa3-b43c-14d58054745f","Type":"ContainerDied","Data":"fab639ae2820c5dc91dff4c511c5fc6900abbc63c0111306c0829dcd0b5c1261"} Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.933057 4831 scope.go:117] 
"RemoveContainer" containerID="e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.967281 4831 scope.go:117] "RemoveContainer" containerID="b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd" Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.985443 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:52 crc kubenswrapper[4831]: I1203 06:53:52.993868 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.006098 4831 scope.go:117] "RemoveContainer" containerID="e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3" Dec 03 06:53:53 crc kubenswrapper[4831]: E1203 06:53:53.006687 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3\": container with ID starting with e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3 not found: ID does not exist" containerID="e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.006758 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3"} err="failed to get container status \"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3\": rpc error: code = NotFound desc = could not find container \"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3\": container with ID starting with e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3 not found: ID does not exist" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.006798 4831 scope.go:117] "RemoveContainer" containerID="b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd" Dec 
03 06:53:53 crc kubenswrapper[4831]: E1203 06:53:53.007220 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd\": container with ID starting with b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd not found: ID does not exist" containerID="b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.007274 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd"} err="failed to get container status \"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd\": rpc error: code = NotFound desc = could not find container \"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd\": container with ID starting with b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd not found: ID does not exist" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.007306 4831 scope.go:117] "RemoveContainer" containerID="e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.007879 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3"} err="failed to get container status \"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3\": rpc error: code = NotFound desc = could not find container \"e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3\": container with ID starting with e1005f6b0b2659b7fed89f9bbf060394fc8de0fadf4c58212011654d3ce117a3 not found: ID does not exist" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.007914 4831 scope.go:117] "RemoveContainer" 
containerID="b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.008598 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd"} err="failed to get container status \"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd\": rpc error: code = NotFound desc = could not find container \"b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd\": container with ID starting with b2eeff5870ac3dcbf3618564cf86a68509a847aa3390e91f425970669cb4bdfd not found: ID does not exist" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.067641 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" path="/var/lib/kubelet/pods/56aff6b8-8e5d-4fa3-b43c-14d58054745f/volumes" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.068601 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:53 crc kubenswrapper[4831]: E1203 06:53:53.069876 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerName="nova-metadata-metadata" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.069902 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerName="nova-metadata-metadata" Dec 03 06:53:53 crc kubenswrapper[4831]: E1203 06:53:53.069927 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerName="nova-metadata-log" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.069939 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerName="nova-metadata-log" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.070526 4831 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerName="nova-metadata-log" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.070562 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="56aff6b8-8e5d-4fa3-b43c-14d58054745f" containerName="nova-metadata-metadata" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.073270 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.078842 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.079954 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.084696 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.119525 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-config-data\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.119775 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367e8bb2-e5b4-47f9-8c20-a992a9686074-logs\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.120031 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.120073 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.120119 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdf48\" (UniqueName: \"kubernetes.io/projected/367e8bb2-e5b4-47f9-8c20-a992a9686074-kube-api-access-mdf48\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.221695 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367e8bb2-e5b4-47f9-8c20-a992a9686074-logs\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.221800 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.221846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 
crc kubenswrapper[4831]: I1203 06:53:53.221889 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdf48\" (UniqueName: \"kubernetes.io/projected/367e8bb2-e5b4-47f9-8c20-a992a9686074-kube-api-access-mdf48\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.221981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-config-data\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.222482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367e8bb2-e5b4-47f9-8c20-a992a9686074-logs\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.228329 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.228920 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.230710 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-config-data\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.247139 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdf48\" (UniqueName: \"kubernetes.io/projected/367e8bb2-e5b4-47f9-8c20-a992a9686074-kube-api-access-mdf48\") pod \"nova-metadata-0\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.399731 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:53 crc kubenswrapper[4831]: W1203 06:53:53.889462 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod367e8bb2_e5b4_47f9_8c20_a992a9686074.slice/crio-24441865215a36f01b4d912cde61cfe446551966b5e97c58eed7a17f5ec5f5ba WatchSource:0}: Error finding container 24441865215a36f01b4d912cde61cfe446551966b5e97c58eed7a17f5ec5f5ba: Status 404 returned error can't find the container with id 24441865215a36f01b4d912cde61cfe446551966b5e97c58eed7a17f5ec5f5ba Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.894851 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:53 crc kubenswrapper[4831]: I1203 06:53:53.936637 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"367e8bb2-e5b4-47f9-8c20-a992a9686074","Type":"ContainerStarted","Data":"24441865215a36f01b4d912cde61cfe446551966b5e97c58eed7a17f5ec5f5ba"} Dec 03 06:53:54 crc kubenswrapper[4831]: I1203 06:53:54.957017 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"367e8bb2-e5b4-47f9-8c20-a992a9686074","Type":"ContainerStarted","Data":"a67dbe9acf5cebdd7d1e210df2673326df61a2ca4bf8fc85830ebb44bc1d3ed7"} Dec 03 06:53:54 crc kubenswrapper[4831]: I1203 06:53:54.957438 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"367e8bb2-e5b4-47f9-8c20-a992a9686074","Type":"ContainerStarted","Data":"8d12da2b30487a5cd15d8065a56fec96c2b95250e57ebf0f8a9e54418f944608"} Dec 03 06:53:54 crc kubenswrapper[4831]: I1203 06:53:54.961694 4831 generic.go:334] "Generic (PLEG): container finished" podID="c09903ad-c95e-4958-904f-11dc2e7e52cb" containerID="658edd03dc1f0ca24b716ddd82b056b7f1cebc66352ec23318becee9a216d637" exitCode=0 Dec 03 06:53:54 crc kubenswrapper[4831]: I1203 06:53:54.961770 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vw5wm" event={"ID":"c09903ad-c95e-4958-904f-11dc2e7e52cb","Type":"ContainerDied","Data":"658edd03dc1f0ca24b716ddd82b056b7f1cebc66352ec23318becee9a216d637"} Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.003766 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.003746458 podStartE2EDuration="3.003746458s" podCreationTimestamp="2025-12-03 06:53:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:53:54.978498705 +0000 UTC m=+1372.322082233" watchObservedRunningTime="2025-12-03 06:53:55.003746458 +0000 UTC m=+1372.347329966" Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.130112 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.130179 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.139725 4831 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.139861 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.191645 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.429641 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.443633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.512495 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bggbl"] Dec 03 06:53:55 crc kubenswrapper[4831]: I1203 06:53:55.512729 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" podUID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerName="dnsmasq-dns" containerID="cri-o://37442f25c3406732dc8f977e015ae0191d021a30df623fa7e2cf9a8112d10750" gracePeriod=10 Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.006526 4831 generic.go:334] "Generic (PLEG): container finished" podID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerID="37442f25c3406732dc8f977e015ae0191d021a30df623fa7e2cf9a8112d10750" exitCode=0 Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.006602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" event={"ID":"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7","Type":"ContainerDied","Data":"37442f25c3406732dc8f977e015ae0191d021a30df623fa7e2cf9a8112d10750"} Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.006937 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-6578955fd5-bggbl" event={"ID":"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7","Type":"ContainerDied","Data":"2a243ee5e2ac9b1265ede5fccbf297a17120811b5ab1827cff58d87adf39a0f2"} Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.006953 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a243ee5e2ac9b1265ede5fccbf297a17120811b5ab1827cff58d87adf39a0f2" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.015645 4831 generic.go:334] "Generic (PLEG): container finished" podID="1e1232dc-f0a3-4694-ba92-127eaba9fda6" containerID="cead6ad1cbb0e7e540d790058263a8a503bc9aedb3e982519dedca76eca29655" exitCode=0 Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.015705 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" event={"ID":"1e1232dc-f0a3-4694-ba92-127eaba9fda6","Type":"ContainerDied","Data":"cead6ad1cbb0e7e540d790058263a8a503bc9aedb3e982519dedca76eca29655"} Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.051550 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.053956 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.078719 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj87h\" (UniqueName: \"kubernetes.io/projected/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-kube-api-access-wj87h\") pod \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.078787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-swift-storage-0\") pod \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.078878 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-sb\") pod \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.079029 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-nb\") pod \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.079065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-svc\") pod \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.079093 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-config\") pod \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\" (UID: \"a3cf71a6-daa0-4f51-ba7d-f1663e5669e7\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.104664 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-kube-api-access-wj87h" (OuterVolumeSpecName: "kube-api-access-wj87h") pod "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" (UID: "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7"). InnerVolumeSpecName "kube-api-access-wj87h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.143172 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" (UID: "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.172740 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" (UID: "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.181655 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.181699 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj87h\" (UniqueName: \"kubernetes.io/projected/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-kube-api-access-wj87h\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.181717 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.181858 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" (UID: "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.200043 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" (UID: "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.204656 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-config" (OuterVolumeSpecName: "config") pod "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" (UID: "a3cf71a6-daa0-4f51-ba7d-f1663e5669e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.212604 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.215202 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.282934 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.282975 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.282988 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc 
kubenswrapper[4831]: I1203 06:53:56.326921 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.487834 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-config-data\") pod \"c09903ad-c95e-4958-904f-11dc2e7e52cb\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.488159 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-scripts\") pod \"c09903ad-c95e-4958-904f-11dc2e7e52cb\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.488233 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-combined-ca-bundle\") pod \"c09903ad-c95e-4958-904f-11dc2e7e52cb\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.488265 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2t6r\" (UniqueName: \"kubernetes.io/projected/c09903ad-c95e-4958-904f-11dc2e7e52cb-kube-api-access-g2t6r\") pod \"c09903ad-c95e-4958-904f-11dc2e7e52cb\" (UID: \"c09903ad-c95e-4958-904f-11dc2e7e52cb\") " Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.491816 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09903ad-c95e-4958-904f-11dc2e7e52cb-kube-api-access-g2t6r" (OuterVolumeSpecName: "kube-api-access-g2t6r") pod "c09903ad-c95e-4958-904f-11dc2e7e52cb" (UID: "c09903ad-c95e-4958-904f-11dc2e7e52cb"). InnerVolumeSpecName "kube-api-access-g2t6r". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.494938 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-scripts" (OuterVolumeSpecName: "scripts") pod "c09903ad-c95e-4958-904f-11dc2e7e52cb" (UID: "c09903ad-c95e-4958-904f-11dc2e7e52cb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.523498 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-config-data" (OuterVolumeSpecName: "config-data") pod "c09903ad-c95e-4958-904f-11dc2e7e52cb" (UID: "c09903ad-c95e-4958-904f-11dc2e7e52cb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.523914 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c09903ad-c95e-4958-904f-11dc2e7e52cb" (UID: "c09903ad-c95e-4958-904f-11dc2e7e52cb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.589985 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.590032 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.590042 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c09903ad-c95e-4958-904f-11dc2e7e52cb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:56 crc kubenswrapper[4831]: I1203 06:53:56.590052 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g2t6r\" (UniqueName: \"kubernetes.io/projected/c09903ad-c95e-4958-904f-11dc2e7e52cb-kube-api-access-g2t6r\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.026352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-vw5wm" event={"ID":"c09903ad-c95e-4958-904f-11dc2e7e52cb","Type":"ContainerDied","Data":"50b68433212e913a7c1f1be441aef8119c4d032c54c9062d27e252142a1967ab"} Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.026672 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50b68433212e913a7c1f1be441aef8119c4d032c54c9062d27e252142a1967ab" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.026414 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.027286 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-vw5wm" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.104415 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bggbl"] Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.122073 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-bggbl"] Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.139128 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.139491 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-log" containerID="cri-o://4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9" gracePeriod=30 Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.139764 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-api" containerID="cri-o://3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7" gracePeriod=30 Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.152441 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.167789 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.168083 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerName="nova-metadata-log" containerID="cri-o://8d12da2b30487a5cd15d8065a56fec96c2b95250e57ebf0f8a9e54418f944608" gracePeriod=30 Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.169485 4831 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/nova-metadata-0" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerName="nova-metadata-metadata" containerID="cri-o://a67dbe9acf5cebdd7d1e210df2673326df61a2ca4bf8fc85830ebb44bc1d3ed7" gracePeriod=30 Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.445330 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.510267 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-scripts\") pod \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.510412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q6n9h\" (UniqueName: \"kubernetes.io/projected/1e1232dc-f0a3-4694-ba92-127eaba9fda6-kube-api-access-q6n9h\") pod \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.510467 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-combined-ca-bundle\") pod \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.510584 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-config-data\") pod \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\" (UID: \"1e1232dc-f0a3-4694-ba92-127eaba9fda6\") " Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.515351 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-scripts" (OuterVolumeSpecName: "scripts") pod "1e1232dc-f0a3-4694-ba92-127eaba9fda6" (UID: "1e1232dc-f0a3-4694-ba92-127eaba9fda6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.515386 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e1232dc-f0a3-4694-ba92-127eaba9fda6-kube-api-access-q6n9h" (OuterVolumeSpecName: "kube-api-access-q6n9h") pod "1e1232dc-f0a3-4694-ba92-127eaba9fda6" (UID: "1e1232dc-f0a3-4694-ba92-127eaba9fda6"). InnerVolumeSpecName "kube-api-access-q6n9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.541261 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1e1232dc-f0a3-4694-ba92-127eaba9fda6" (UID: "1e1232dc-f0a3-4694-ba92-127eaba9fda6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.548396 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-config-data" (OuterVolumeSpecName: "config-data") pod "1e1232dc-f0a3-4694-ba92-127eaba9fda6" (UID: "1e1232dc-f0a3-4694-ba92-127eaba9fda6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.597244 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.597308 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.597368 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.598044 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cdd4c5391a62949652f778a85945bc3bd1190df6d8604a3965df5d75b3dfc56a"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.598119 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://cdd4c5391a62949652f778a85945bc3bd1190df6d8604a3965df5d75b3dfc56a" gracePeriod=600 Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.612594 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.612627 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6n9h\" (UniqueName: \"kubernetes.io/projected/1e1232dc-f0a3-4694-ba92-127eaba9fda6-kube-api-access-q6n9h\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.612639 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.612648 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1e1232dc-f0a3-4694-ba92-127eaba9fda6-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:57 crc kubenswrapper[4831]: I1203 06:53:57.854671 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.042005 4831 generic.go:334] "Generic (PLEG): container finished" podID="7521f444-707b-4cda-ab06-827988c985b9" containerID="4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9" exitCode=143 Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.042091 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7521f444-707b-4cda-ab06-827988c985b9","Type":"ContainerDied","Data":"4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9"} Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.046207 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" event={"ID":"1e1232dc-f0a3-4694-ba92-127eaba9fda6","Type":"ContainerDied","Data":"f09af791e118b4a119e1730fb9d051f1805db9a43152da9f2e529bbe51c03d21"} Dec 03 06:53:58 crc kubenswrapper[4831]: 
I1203 06:53:58.046237 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-gjh7q" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.046248 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f09af791e118b4a119e1730fb9d051f1805db9a43152da9f2e529bbe51c03d21" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.050670 4831 generic.go:334] "Generic (PLEG): container finished" podID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerID="a67dbe9acf5cebdd7d1e210df2673326df61a2ca4bf8fc85830ebb44bc1d3ed7" exitCode=0 Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.050706 4831 generic.go:334] "Generic (PLEG): container finished" podID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerID="8d12da2b30487a5cd15d8065a56fec96c2b95250e57ebf0f8a9e54418f944608" exitCode=143 Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.050755 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"367e8bb2-e5b4-47f9-8c20-a992a9686074","Type":"ContainerDied","Data":"a67dbe9acf5cebdd7d1e210df2673326df61a2ca4bf8fc85830ebb44bc1d3ed7"} Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.050787 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"367e8bb2-e5b4-47f9-8c20-a992a9686074","Type":"ContainerDied","Data":"8d12da2b30487a5cd15d8065a56fec96c2b95250e57ebf0f8a9e54418f944608"} Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.064083 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="cdd4c5391a62949652f778a85945bc3bd1190df6d8604a3965df5d75b3dfc56a" exitCode=0 Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.064235 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"cdd4c5391a62949652f778a85945bc3bd1190df6d8604a3965df5d75b3dfc56a"} Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.064301 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d"} Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.064352 4831 scope.go:117] "RemoveContainer" containerID="fe6e405940a8abb32a63a2b267869e6a6149449d90d59de98ade036092eb761f" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.198285 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 06:53:58 crc kubenswrapper[4831]: E1203 06:53:58.203211 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerName="init" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.203229 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerName="init" Dec 03 06:53:58 crc kubenswrapper[4831]: E1203 06:53:58.203649 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09903ad-c95e-4958-904f-11dc2e7e52cb" containerName="nova-manage" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.203657 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09903ad-c95e-4958-904f-11dc2e7e52cb" containerName="nova-manage" Dec 03 06:53:58 crc kubenswrapper[4831]: E1203 06:53:58.203675 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerName="dnsmasq-dns" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.203681 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerName="dnsmasq-dns" Dec 03 06:53:58 crc 
kubenswrapper[4831]: E1203 06:53:58.203690 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e1232dc-f0a3-4694-ba92-127eaba9fda6" containerName="nova-cell1-conductor-db-sync" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.203696 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e1232dc-f0a3-4694-ba92-127eaba9fda6" containerName="nova-cell1-conductor-db-sync" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.203878 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e1232dc-f0a3-4694-ba92-127eaba9fda6" containerName="nova-cell1-conductor-db-sync" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.203897 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09903ad-c95e-4958-904f-11dc2e7e52cb" containerName="nova-manage" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.203918 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerName="dnsmasq-dns" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.204721 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.208649 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.213868 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.350700 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.351037 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zz2mw\" (UniqueName: \"kubernetes.io/projected/c60bce87-ea0b-4b3d-8243-93ed40c232ff-kube-api-access-zz2mw\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.351108 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.359766 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.453433 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.453576 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.453655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zz2mw\" (UniqueName: \"kubernetes.io/projected/c60bce87-ea0b-4b3d-8243-93ed40c232ff-kube-api-access-zz2mw\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.459132 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.460295 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.475799 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zz2mw\" (UniqueName: \"kubernetes.io/projected/c60bce87-ea0b-4b3d-8243-93ed40c232ff-kube-api-access-zz2mw\") pod \"nova-cell1-conductor-0\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.528602 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.554626 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-nova-metadata-tls-certs\") pod \"367e8bb2-e5b4-47f9-8c20-a992a9686074\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.554718 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-combined-ca-bundle\") pod \"367e8bb2-e5b4-47f9-8c20-a992a9686074\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.554786 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367e8bb2-e5b4-47f9-8c20-a992a9686074-logs\") pod \"367e8bb2-e5b4-47f9-8c20-a992a9686074\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.554854 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-config-data\") pod \"367e8bb2-e5b4-47f9-8c20-a992a9686074\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.554936 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-mdf48\" (UniqueName: \"kubernetes.io/projected/367e8bb2-e5b4-47f9-8c20-a992a9686074-kube-api-access-mdf48\") pod \"367e8bb2-e5b4-47f9-8c20-a992a9686074\" (UID: \"367e8bb2-e5b4-47f9-8c20-a992a9686074\") " Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.555529 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/367e8bb2-e5b4-47f9-8c20-a992a9686074-logs" (OuterVolumeSpecName: "logs") pod "367e8bb2-e5b4-47f9-8c20-a992a9686074" (UID: "367e8bb2-e5b4-47f9-8c20-a992a9686074"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.558580 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/367e8bb2-e5b4-47f9-8c20-a992a9686074-kube-api-access-mdf48" (OuterVolumeSpecName: "kube-api-access-mdf48") pod "367e8bb2-e5b4-47f9-8c20-a992a9686074" (UID: "367e8bb2-e5b4-47f9-8c20-a992a9686074"). InnerVolumeSpecName "kube-api-access-mdf48". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.583519 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "367e8bb2-e5b4-47f9-8c20-a992a9686074" (UID: "367e8bb2-e5b4-47f9-8c20-a992a9686074"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.584031 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-config-data" (OuterVolumeSpecName: "config-data") pod "367e8bb2-e5b4-47f9-8c20-a992a9686074" (UID: "367e8bb2-e5b4-47f9-8c20-a992a9686074"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.624645 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "367e8bb2-e5b4-47f9-8c20-a992a9686074" (UID: "367e8bb2-e5b4-47f9-8c20-a992a9686074"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.657515 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/367e8bb2-e5b4-47f9-8c20-a992a9686074-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.657580 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.657611 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdf48\" (UniqueName: \"kubernetes.io/projected/367e8bb2-e5b4-47f9-8c20-a992a9686074-kube-api-access-mdf48\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.657639 4831 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:58 crc kubenswrapper[4831]: I1203 06:53:58.657658 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/367e8bb2-e5b4-47f9-8c20-a992a9686074-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.023650 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" path="/var/lib/kubelet/pods/a3cf71a6-daa0-4f51-ba7d-f1663e5669e7/volumes" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.080297 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"367e8bb2-e5b4-47f9-8c20-a992a9686074","Type":"ContainerDied","Data":"24441865215a36f01b4d912cde61cfe446551966b5e97c58eed7a17f5ec5f5ba"} Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.082024 4831 scope.go:117] "RemoveContainer" containerID="a67dbe9acf5cebdd7d1e210df2673326df61a2ca4bf8fc85830ebb44bc1d3ed7" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.080481 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.087808 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="b72aaa56-c315-426d-87e3-7f1996fd1bdf" containerName="nova-scheduler-scheduler" containerID="cri-o://890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8" gracePeriod=30 Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.121825 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.130349 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.137960 4831 scope.go:117] "RemoveContainer" containerID="8d12da2b30487a5cd15d8065a56fec96c2b95250e57ebf0f8a9e54418f944608" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.140432 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:59 crc kubenswrapper[4831]: W1203 06:53:59.144643 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc60bce87_ea0b_4b3d_8243_93ed40c232ff.slice/crio-7ed4005aa07f9a098997fab85847f9392c3a28c78abe03b54b5d13fcf163d7d7 WatchSource:0}: Error finding container 7ed4005aa07f9a098997fab85847f9392c3a28c78abe03b54b5d13fcf163d7d7: Status 404 returned error can't find the container with id 7ed4005aa07f9a098997fab85847f9392c3a28c78abe03b54b5d13fcf163d7d7 Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.159466 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:59 crc kubenswrapper[4831]: E1203 06:53:59.159873 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerName="nova-metadata-metadata" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.159895 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerName="nova-metadata-metadata" Dec 03 06:53:59 crc kubenswrapper[4831]: E1203 06:53:59.159911 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerName="nova-metadata-log" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.159920 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerName="nova-metadata-log" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.161656 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerName="nova-metadata-log" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.161690 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" containerName="nova-metadata-metadata" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.162823 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.164735 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.165016 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.168549 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.272734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9eb2fbe-db81-4563-82c4-b20e364a64e2-logs\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.272847 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.272949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-config-data\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.273005 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: 
\"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.273105 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-677w7\" (UniqueName: \"kubernetes.io/projected/b9eb2fbe-db81-4563-82c4-b20e364a64e2-kube-api-access-677w7\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.374848 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-677w7\" (UniqueName: \"kubernetes.io/projected/b9eb2fbe-db81-4563-82c4-b20e364a64e2-kube-api-access-677w7\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.374914 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9eb2fbe-db81-4563-82c4-b20e364a64e2-logs\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.374976 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.375040 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-config-data\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.375063 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.376010 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9eb2fbe-db81-4563-82c4-b20e364a64e2-logs\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.379447 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.379656 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.380260 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-config-data\") pod \"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.401227 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-677w7\" (UniqueName: \"kubernetes.io/projected/b9eb2fbe-db81-4563-82c4-b20e364a64e2-kube-api-access-677w7\") pod 
\"nova-metadata-0\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " pod="openstack/nova-metadata-0" Dec 03 06:53:59 crc kubenswrapper[4831]: I1203 06:53:59.604680 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:54:00 crc kubenswrapper[4831]: I1203 06:54:00.097084 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:54:00 crc kubenswrapper[4831]: I1203 06:54:00.101531 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c60bce87-ea0b-4b3d-8243-93ed40c232ff","Type":"ContainerStarted","Data":"d2b937efb5cb363c5d2b1ffebf2c05783efceefbe6e756b381e8b96438b287fb"} Dec 03 06:54:00 crc kubenswrapper[4831]: I1203 06:54:00.101607 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c60bce87-ea0b-4b3d-8243-93ed40c232ff","Type":"ContainerStarted","Data":"7ed4005aa07f9a098997fab85847f9392c3a28c78abe03b54b5d13fcf163d7d7"} Dec 03 06:54:00 crc kubenswrapper[4831]: I1203 06:54:00.101770 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 06:54:00 crc kubenswrapper[4831]: I1203 06:54:00.128441 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.128423008 podStartE2EDuration="2.128423008s" podCreationTimestamp="2025-12-03 06:53:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:00.121220315 +0000 UTC m=+1377.464803823" watchObservedRunningTime="2025-12-03 06:54:00.128423008 +0000 UTC m=+1377.472006506" Dec 03 06:54:00 crc kubenswrapper[4831]: E1203 06:54:00.141658 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is 
stopping, stdout: , stderr: , exit code -1" containerID="890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 06:54:00 crc kubenswrapper[4831]: E1203 06:54:00.143284 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 06:54:00 crc kubenswrapper[4831]: E1203 06:54:00.144761 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 06:54:00 crc kubenswrapper[4831]: E1203 06:54:00.144801 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="b72aaa56-c315-426d-87e3-7f1996fd1bdf" containerName="nova-scheduler-scheduler" Dec 03 06:54:00 crc kubenswrapper[4831]: I1203 06:54:00.935426 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6578955fd5-bggbl" podUID="a3cf71a6-daa0-4f51-ba7d-f1663e5669e7" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.161:5353: i/o timeout" Dec 03 06:54:01 crc kubenswrapper[4831]: I1203 06:54:01.025809 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="367e8bb2-e5b4-47f9-8c20-a992a9686074" path="/var/lib/kubelet/pods/367e8bb2-e5b4-47f9-8c20-a992a9686074/volumes" Dec 03 06:54:01 crc kubenswrapper[4831]: I1203 06:54:01.117701 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9eb2fbe-db81-4563-82c4-b20e364a64e2","Type":"ContainerStarted","Data":"13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863"} Dec 03 06:54:01 crc kubenswrapper[4831]: I1203 06:54:01.117809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9eb2fbe-db81-4563-82c4-b20e364a64e2","Type":"ContainerStarted","Data":"c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03"} Dec 03 06:54:01 crc kubenswrapper[4831]: I1203 06:54:01.117833 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9eb2fbe-db81-4563-82c4-b20e364a64e2","Type":"ContainerStarted","Data":"49d6e71b52757a5ca8d446de334ba984d88d2a09292421a8c4768f053659fb3c"} Dec 03 06:54:01 crc kubenswrapper[4831]: I1203 06:54:01.139117 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.139081419 podStartE2EDuration="2.139081419s" podCreationTimestamp="2025-12-03 06:53:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:01.134089504 +0000 UTC m=+1378.477673052" watchObservedRunningTime="2025-12-03 06:54:01.139081419 +0000 UTC m=+1378.482664987" Dec 03 06:54:01 crc kubenswrapper[4831]: I1203 06:54:01.777433 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:54:01 crc kubenswrapper[4831]: I1203 06:54:01.777971 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="0f65d17d-bff1-412c-94ab-cf83c538a36c" containerName="kube-state-metrics" containerID="cri-o://ecd51d1a399172f04cada1b6c630a980e38d48f2222540f5057abc4cda8babb0" gracePeriod=30 Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.099819 4831 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.149191 4831 generic.go:334] "Generic (PLEG): container finished" podID="0f65d17d-bff1-412c-94ab-cf83c538a36c" containerID="ecd51d1a399172f04cada1b6c630a980e38d48f2222540f5057abc4cda8babb0" exitCode=2 Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.149228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0f65d17d-bff1-412c-94ab-cf83c538a36c","Type":"ContainerDied","Data":"ecd51d1a399172f04cada1b6c630a980e38d48f2222540f5057abc4cda8babb0"} Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.151776 4831 generic.go:334] "Generic (PLEG): container finished" podID="7521f444-707b-4cda-ab06-827988c985b9" containerID="3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7" exitCode=0 Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.152773 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.153296 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7521f444-707b-4cda-ab06-827988c985b9","Type":"ContainerDied","Data":"3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7"} Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.153340 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"7521f444-707b-4cda-ab06-827988c985b9","Type":"ContainerDied","Data":"7356ef4704df3f306b3f8e9c79647fe0ecddb6361da77c973adca2de6c919e3b"} Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.153357 4831 scope.go:117] "RemoveContainer" containerID="3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.175198 4831 scope.go:117] "RemoveContainer" containerID="4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9" Dec 03 
06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.241188 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7521f444-707b-4cda-ab06-827988c985b9-logs\") pod \"7521f444-707b-4cda-ab06-827988c985b9\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.241450 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-combined-ca-bundle\") pod \"7521f444-707b-4cda-ab06-827988c985b9\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.241640 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsbrb\" (UniqueName: \"kubernetes.io/projected/7521f444-707b-4cda-ab06-827988c985b9-kube-api-access-lsbrb\") pod \"7521f444-707b-4cda-ab06-827988c985b9\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.241674 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-config-data\") pod \"7521f444-707b-4cda-ab06-827988c985b9\" (UID: \"7521f444-707b-4cda-ab06-827988c985b9\") " Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.241789 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7521f444-707b-4cda-ab06-827988c985b9-logs" (OuterVolumeSpecName: "logs") pod "7521f444-707b-4cda-ab06-827988c985b9" (UID: "7521f444-707b-4cda-ab06-827988c985b9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.242093 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7521f444-707b-4cda-ab06-827988c985b9-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.247215 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7521f444-707b-4cda-ab06-827988c985b9-kube-api-access-lsbrb" (OuterVolumeSpecName: "kube-api-access-lsbrb") pod "7521f444-707b-4cda-ab06-827988c985b9" (UID: "7521f444-707b-4cda-ab06-827988c985b9"). InnerVolumeSpecName "kube-api-access-lsbrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.265185 4831 scope.go:117] "RemoveContainer" containerID="3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7" Dec 03 06:54:02 crc kubenswrapper[4831]: E1203 06:54:02.265636 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7\": container with ID starting with 3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7 not found: ID does not exist" containerID="3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.265676 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7"} err="failed to get container status \"3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7\": rpc error: code = NotFound desc = could not find container \"3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7\": container with ID starting with 3f5495541ad4768cbf19b795446392d6715d36c918a6c14d4e58e86942f002e7 not found: ID does not 
exist" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.265702 4831 scope.go:117] "RemoveContainer" containerID="4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9" Dec 03 06:54:02 crc kubenswrapper[4831]: E1203 06:54:02.266066 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9\": container with ID starting with 4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9 not found: ID does not exist" containerID="4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.266130 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9"} err="failed to get container status \"4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9\": rpc error: code = NotFound desc = could not find container \"4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9\": container with ID starting with 4ca2e843ae48855de4a913ac91e55cbb2fbeebc91b668d9d4229fee21f523ac9 not found: ID does not exist" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.272123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7521f444-707b-4cda-ab06-827988c985b9" (UID: "7521f444-707b-4cda-ab06-827988c985b9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.272818 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-config-data" (OuterVolumeSpecName: "config-data") pod "7521f444-707b-4cda-ab06-827988c985b9" (UID: "7521f444-707b-4cda-ab06-827988c985b9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.272899 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.343735 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsbrb\" (UniqueName: \"kubernetes.io/projected/7521f444-707b-4cda-ab06-827988c985b9-kube-api-access-lsbrb\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.343778 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.343792 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7521f444-707b-4cda-ab06-827988c985b9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.454604 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v2bnf\" (UniqueName: \"kubernetes.io/projected/0f65d17d-bff1-412c-94ab-cf83c538a36c-kube-api-access-v2bnf\") pod \"0f65d17d-bff1-412c-94ab-cf83c538a36c\" (UID: \"0f65d17d-bff1-412c-94ab-cf83c538a36c\") " Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.457241 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/0f65d17d-bff1-412c-94ab-cf83c538a36c-kube-api-access-v2bnf" (OuterVolumeSpecName: "kube-api-access-v2bnf") pod "0f65d17d-bff1-412c-94ab-cf83c538a36c" (UID: "0f65d17d-bff1-412c-94ab-cf83c538a36c"). InnerVolumeSpecName "kube-api-access-v2bnf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.488628 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.502057 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.508304 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:02 crc kubenswrapper[4831]: E1203 06:54:02.508782 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-api" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.508796 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-api" Dec 03 06:54:02 crc kubenswrapper[4831]: E1203 06:54:02.508808 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f65d17d-bff1-412c-94ab-cf83c538a36c" containerName="kube-state-metrics" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.508813 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f65d17d-bff1-412c-94ab-cf83c538a36c" containerName="kube-state-metrics" Dec 03 06:54:02 crc kubenswrapper[4831]: E1203 06:54:02.508824 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-log" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.508831 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-log" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 
06:54:02.508986 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f65d17d-bff1-412c-94ab-cf83c538a36c" containerName="kube-state-metrics" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.509005 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-log" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.509012 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7521f444-707b-4cda-ab06-827988c985b9" containerName="nova-api-api" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.510011 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.512539 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.527844 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.556893 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.556994 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-config-data\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.557028 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/a81d9172-7404-4091-ba7a-60c5ee3266ea-logs\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.557050 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsq5n\" (UniqueName: \"kubernetes.io/projected/a81d9172-7404-4091-ba7a-60c5ee3266ea-kube-api-access-zsq5n\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.557117 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v2bnf\" (UniqueName: \"kubernetes.io/projected/0f65d17d-bff1-412c-94ab-cf83c538a36c-kube-api-access-v2bnf\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.576830 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.658809 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-combined-ca-bundle\") pod \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.658875 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ksbq4\" (UniqueName: \"kubernetes.io/projected/b72aaa56-c315-426d-87e3-7f1996fd1bdf-kube-api-access-ksbq4\") pod \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.658930 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-config-data\") pod 
\"b72aaa56-c315-426d-87e3-7f1996fd1bdf\" (UID: \"b72aaa56-c315-426d-87e3-7f1996fd1bdf\") " Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.659261 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-config-data\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.659306 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81d9172-7404-4091-ba7a-60c5ee3266ea-logs\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.659344 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsq5n\" (UniqueName: \"kubernetes.io/projected/a81d9172-7404-4091-ba7a-60c5ee3266ea-kube-api-access-zsq5n\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.659402 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.660573 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81d9172-7404-4091-ba7a-60c5ee3266ea-logs\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.663897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b72aaa56-c315-426d-87e3-7f1996fd1bdf-kube-api-access-ksbq4" (OuterVolumeSpecName: "kube-api-access-ksbq4") pod "b72aaa56-c315-426d-87e3-7f1996fd1bdf" (UID: "b72aaa56-c315-426d-87e3-7f1996fd1bdf"). InnerVolumeSpecName "kube-api-access-ksbq4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.663948 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-config-data\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.664588 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.675845 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsq5n\" (UniqueName: \"kubernetes.io/projected/a81d9172-7404-4091-ba7a-60c5ee3266ea-kube-api-access-zsq5n\") pod \"nova-api-0\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " pod="openstack/nova-api-0" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.688755 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-config-data" (OuterVolumeSpecName: "config-data") pod "b72aaa56-c315-426d-87e3-7f1996fd1bdf" (UID: "b72aaa56-c315-426d-87e3-7f1996fd1bdf"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.692293 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b72aaa56-c315-426d-87e3-7f1996fd1bdf" (UID: "b72aaa56-c315-426d-87e3-7f1996fd1bdf"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.760782 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.760810 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ksbq4\" (UniqueName: \"kubernetes.io/projected/b72aaa56-c315-426d-87e3-7f1996fd1bdf-kube-api-access-ksbq4\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.760822 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b72aaa56-c315-426d-87e3-7f1996fd1bdf-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:02 crc kubenswrapper[4831]: I1203 06:54:02.829265 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.032985 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7521f444-707b-4cda-ab06-827988c985b9" path="/var/lib/kubelet/pods/7521f444-707b-4cda-ab06-827988c985b9/volumes" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.166553 4831 generic.go:334] "Generic (PLEG): container finished" podID="b72aaa56-c315-426d-87e3-7f1996fd1bdf" containerID="890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8" exitCode=0 Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.166722 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.166713 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b72aaa56-c315-426d-87e3-7f1996fd1bdf","Type":"ContainerDied","Data":"890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8"} Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.166763 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b72aaa56-c315-426d-87e3-7f1996fd1bdf","Type":"ContainerDied","Data":"1ccda456d4dc12c8c6a133df8a6027910ae1afa9da2b47d213aec1dc6484e32a"} Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.166785 4831 scope.go:117] "RemoveContainer" containerID="890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.172665 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"0f65d17d-bff1-412c-94ab-cf83c538a36c","Type":"ContainerDied","Data":"edcd9b50446d35c2e66d6508ac37e4fd5fa84752dc01e31d062149bdc66dc8cb"} Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.172756 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.200209 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.209931 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.223381 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.228443 4831 scope.go:117] "RemoveContainer" containerID="890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8" Dec 03 06:54:03 crc kubenswrapper[4831]: E1203 06:54:03.230280 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8\": container with ID starting with 890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8 not found: ID does not exist" containerID="890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.230337 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8"} err="failed to get container status \"890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8\": rpc error: code = NotFound desc = could not find container \"890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8\": container with ID starting with 890e6eb25b1003ff462d519a52289a4ae3cc5c915004bc8a10091ad78bb02bd8 not found: ID does not exist" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.230367 4831 scope.go:117] "RemoveContainer" containerID="ecd51d1a399172f04cada1b6c630a980e38d48f2222540f5057abc4cda8babb0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 
06:54:03.235117 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: E1203 06:54:03.235755 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b72aaa56-c315-426d-87e3-7f1996fd1bdf" containerName="nova-scheduler-scheduler" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.235877 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b72aaa56-c315-426d-87e3-7f1996fd1bdf" containerName="nova-scheduler-scheduler" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.236437 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b72aaa56-c315-426d-87e3-7f1996fd1bdf" containerName="nova-scheduler-scheduler" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.237447 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.244294 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.246003 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.255554 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.265550 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.266827 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.269746 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.269965 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.273767 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.274715 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.274774 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2qw6\" (UniqueName: \"kubernetes.io/projected/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-api-access-q2qw6\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.274838 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-config-data\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.274865 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.274952 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6jkd\" (UniqueName: \"kubernetes.io/projected/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-kube-api-access-d6jkd\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.274984 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.275002 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.292162 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: W1203 06:54:03.309688 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda81d9172_7404_4091_ba7a_60c5ee3266ea.slice/crio-59dbf63d9435b73ac76cb1b2f5f437f65810278ae754480f8426ceb89bb775a1 WatchSource:0}: Error finding container 59dbf63d9435b73ac76cb1b2f5f437f65810278ae754480f8426ceb89bb775a1: Status 404 returned error can't 
find the container with id 59dbf63d9435b73ac76cb1b2f5f437f65810278ae754480f8426ceb89bb775a1 Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.376344 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6jkd\" (UniqueName: \"kubernetes.io/projected/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-kube-api-access-d6jkd\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.376575 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.376661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.376777 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.376907 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2qw6\" (UniqueName: \"kubernetes.io/projected/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-api-access-q2qw6\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " 
pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.377018 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-config-data\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.377128 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.383935 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.383935 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.383986 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-config-data\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.388076 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.388184 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.391510 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6jkd\" (UniqueName: \"kubernetes.io/projected/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-kube-api-access-d6jkd\") pod \"nova-scheduler-0\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.392658 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2qw6\" (UniqueName: \"kubernetes.io/projected/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-api-access-q2qw6\") pod \"kube-state-metrics-0\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") " pod="openstack/kube-state-metrics-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.544592 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.544967 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="ceilometer-central-agent" containerID="cri-o://2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6" gracePeriod=30 Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.545110 4831 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openstack/ceilometer-0" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="ceilometer-notification-agent" containerID="cri-o://b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f" gracePeriod=30 Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.545335 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="sg-core" containerID="cri-o://cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe" gracePeriod=30 Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.545215 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="proxy-httpd" containerID="cri-o://19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17" gracePeriod=30 Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.559178 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:54:03 crc kubenswrapper[4831]: I1203 06:54:03.582797 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.025725 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:04 crc kubenswrapper[4831]: W1203 06:54:04.028164 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod872f90d4_ad5a_4d6b_a529_4fc5f5fea83c.slice/crio-f55fb49ffdb7b30b6e01745d6231c72f61923c1bc5985a263b021ac2578f63bd WatchSource:0}: Error finding container f55fb49ffdb7b30b6e01745d6231c72f61923c1bc5985a263b021ac2578f63bd: Status 404 returned error can't find the container with id f55fb49ffdb7b30b6e01745d6231c72f61923c1bc5985a263b021ac2578f63bd Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.131097 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.185078 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a81d9172-7404-4091-ba7a-60c5ee3266ea","Type":"ContainerStarted","Data":"9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0"} Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.185396 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a81d9172-7404-4091-ba7a-60c5ee3266ea","Type":"ContainerStarted","Data":"d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853"} Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.185417 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a81d9172-7404-4091-ba7a-60c5ee3266ea","Type":"ContainerStarted","Data":"59dbf63d9435b73ac76cb1b2f5f437f65810278ae754480f8426ceb89bb775a1"} Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.187118 4831 generic.go:334] "Generic (PLEG): container finished" podID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" 
containerID="19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17" exitCode=0 Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.187150 4831 generic.go:334] "Generic (PLEG): container finished" podID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerID="cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe" exitCode=2 Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.187159 4831 generic.go:334] "Generic (PLEG): container finished" podID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerID="2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6" exitCode=0 Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.187207 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerDied","Data":"19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17"} Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.187234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerDied","Data":"cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe"} Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.187243 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerDied","Data":"2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6"} Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.188130 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5bf96e96-13ba-44c9-b16e-b1c2acbfc643","Type":"ContainerStarted","Data":"b7b04ebc30dc59fa6e6dda0dd683307313d31792abba044137eac746a0377a8a"} Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.190587 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c","Type":"ContainerStarted","Data":"f55fb49ffdb7b30b6e01745d6231c72f61923c1bc5985a263b021ac2578f63bd"} Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.201998 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.201979916 podStartE2EDuration="2.201979916s" podCreationTimestamp="2025-12-03 06:54:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:04.199992325 +0000 UTC m=+1381.543575833" watchObservedRunningTime="2025-12-03 06:54:04.201979916 +0000 UTC m=+1381.545563424" Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.605118 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 06:54:04 crc kubenswrapper[4831]: I1203 06:54:04.605209 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 06:54:05 crc kubenswrapper[4831]: I1203 06:54:05.026684 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f65d17d-bff1-412c-94ab-cf83c538a36c" path="/var/lib/kubelet/pods/0f65d17d-bff1-412c-94ab-cf83c538a36c/volumes" Dec 03 06:54:05 crc kubenswrapper[4831]: I1203 06:54:05.027924 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b72aaa56-c315-426d-87e3-7f1996fd1bdf" path="/var/lib/kubelet/pods/b72aaa56-c315-426d-87e3-7f1996fd1bdf/volumes" Dec 03 06:54:05 crc kubenswrapper[4831]: I1203 06:54:05.208016 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5bf96e96-13ba-44c9-b16e-b1c2acbfc643","Type":"ContainerStarted","Data":"a1a3fe54cd3d30665e97d6fbc19db3951584ee5c9ac2f3ff590c8299afe5e9f9"} Dec 03 06:54:05 crc kubenswrapper[4831]: I1203 06:54:05.208847 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/kube-state-metrics-0" Dec 03 06:54:05 crc kubenswrapper[4831]: I1203 06:54:05.212095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c","Type":"ContainerStarted","Data":"135ca5a6d135fc68371a549db1898457ff97cd0c1a268d01721f62961e990264"} Dec 03 06:54:05 crc kubenswrapper[4831]: I1203 06:54:05.237639 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.694106326 podStartE2EDuration="2.23761377s" podCreationTimestamp="2025-12-03 06:54:03 +0000 UTC" firstStartedPulling="2025-12-03 06:54:04.139979093 +0000 UTC m=+1381.483562601" lastFinishedPulling="2025-12-03 06:54:04.683486537 +0000 UTC m=+1382.027070045" observedRunningTime="2025-12-03 06:54:05.228619602 +0000 UTC m=+1382.572203180" watchObservedRunningTime="2025-12-03 06:54:05.23761377 +0000 UTC m=+1382.581197288" Dec 03 06:54:05 crc kubenswrapper[4831]: I1203 06:54:05.275752 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.275724712 podStartE2EDuration="2.275724712s" podCreationTimestamp="2025-12-03 06:54:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:05.252101889 +0000 UTC m=+1382.595685427" watchObservedRunningTime="2025-12-03 06:54:05.275724712 +0000 UTC m=+1382.619308260" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.043004 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.223132 4831 generic.go:334] "Generic (PLEG): container finished" podID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerID="b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f" exitCode=0 Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.224013 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.224010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerDied","Data":"b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f"} Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.224294 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"9bdd7583-b04b-4c20-929b-b69fb3d05aa9","Type":"ContainerDied","Data":"3ee0fe7037ea2749eadf4dbe56a6691c0ff277aaf807750306b452a35a5e238c"} Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.224347 4831 scope.go:117] "RemoveContainer" containerID="19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.225083 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-log-httpd\") pod \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.225139 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-config-data\") pod \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 
06:54:06.225166 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-sg-core-conf-yaml\") pod \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.225251 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-scripts\") pod \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.226544 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "9bdd7583-b04b-4c20-929b-b69fb3d05aa9" (UID: "9bdd7583-b04b-4c20-929b-b69fb3d05aa9"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.227827 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-run-httpd\") pod \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.228440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "9bdd7583-b04b-4c20-929b-b69fb3d05aa9" (UID: "9bdd7583-b04b-4c20-929b-b69fb3d05aa9"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.229609 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-combined-ca-bundle\") pod \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.229839 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxv2k\" (UniqueName: \"kubernetes.io/projected/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-kube-api-access-mxv2k\") pod \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\" (UID: \"9bdd7583-b04b-4c20-929b-b69fb3d05aa9\") " Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.230797 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.231795 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.231654 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-scripts" (OuterVolumeSpecName: "scripts") pod "9bdd7583-b04b-4c20-929b-b69fb3d05aa9" (UID: "9bdd7583-b04b-4c20-929b-b69fb3d05aa9"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.234885 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-kube-api-access-mxv2k" (OuterVolumeSpecName: "kube-api-access-mxv2k") pod "9bdd7583-b04b-4c20-929b-b69fb3d05aa9" (UID: "9bdd7583-b04b-4c20-929b-b69fb3d05aa9"). InnerVolumeSpecName "kube-api-access-mxv2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.251416 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "9bdd7583-b04b-4c20-929b-b69fb3d05aa9" (UID: "9bdd7583-b04b-4c20-929b-b69fb3d05aa9"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.299528 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9bdd7583-b04b-4c20-929b-b69fb3d05aa9" (UID: "9bdd7583-b04b-4c20-929b-b69fb3d05aa9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.326903 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-config-data" (OuterVolumeSpecName: "config-data") pod "9bdd7583-b04b-4c20-929b-b69fb3d05aa9" (UID: "9bdd7583-b04b-4c20-929b-b69fb3d05aa9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.333743 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.333772 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxv2k\" (UniqueName: \"kubernetes.io/projected/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-kube-api-access-mxv2k\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.333783 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.333792 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.333800 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9bdd7583-b04b-4c20-929b-b69fb3d05aa9-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.334383 4831 scope.go:117] "RemoveContainer" containerID="cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.352999 4831 scope.go:117] "RemoveContainer" containerID="b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.372446 4831 scope.go:117] "RemoveContainer" containerID="2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.390374 4831 
scope.go:117] "RemoveContainer" containerID="19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17" Dec 03 06:54:06 crc kubenswrapper[4831]: E1203 06:54:06.390793 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17\": container with ID starting with 19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17 not found: ID does not exist" containerID="19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.390874 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17"} err="failed to get container status \"19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17\": rpc error: code = NotFound desc = could not find container \"19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17\": container with ID starting with 19be777957213075e4546820575fdaec6a08d4cac8c59f524e471de80e74ee17 not found: ID does not exist" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.390946 4831 scope.go:117] "RemoveContainer" containerID="cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe" Dec 03 06:54:06 crc kubenswrapper[4831]: E1203 06:54:06.391303 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe\": container with ID starting with cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe not found: ID does not exist" containerID="cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.391388 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe"} err="failed to get container status \"cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe\": rpc error: code = NotFound desc = could not find container \"cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe\": container with ID starting with cf850d58a7b9d8eeb93c45d17249be6db7780116da4b43d4d4d65253be93eebe not found: ID does not exist" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.391461 4831 scope.go:117] "RemoveContainer" containerID="b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f" Dec 03 06:54:06 crc kubenswrapper[4831]: E1203 06:54:06.391763 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f\": container with ID starting with b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f not found: ID does not exist" containerID="b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.391845 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f"} err="failed to get container status \"b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f\": rpc error: code = NotFound desc = could not find container \"b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f\": container with ID starting with b9a763d25783bb2ff842983da8225794ec4595c451d69e1f3cec040169dbf73f not found: ID does not exist" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.391907 4831 scope.go:117] "RemoveContainer" containerID="2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6" Dec 03 06:54:06 crc kubenswrapper[4831]: E1203 06:54:06.392179 4831 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6\": container with ID starting with 2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6 not found: ID does not exist" containerID="2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.392251 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6"} err="failed to get container status \"2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6\": rpc error: code = NotFound desc = could not find container \"2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6\": container with ID starting with 2933fcb207bc32227fdb1ba781d0341988b90601e7267857a242d7f9788296e6 not found: ID does not exist" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.593292 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.615063 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.625683 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:06 crc kubenswrapper[4831]: E1203 06:54:06.626217 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="sg-core" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.626242 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="sg-core" Dec 03 06:54:06 crc kubenswrapper[4831]: E1203 06:54:06.626275 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="ceilometer-notification-agent" Dec 03 06:54:06 
crc kubenswrapper[4831]: I1203 06:54:06.626284 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="ceilometer-notification-agent" Dec 03 06:54:06 crc kubenswrapper[4831]: E1203 06:54:06.626309 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="proxy-httpd" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.626344 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="proxy-httpd" Dec 03 06:54:06 crc kubenswrapper[4831]: E1203 06:54:06.626375 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="ceilometer-central-agent" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.626384 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="ceilometer-central-agent" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.626638 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="ceilometer-notification-agent" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.626661 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="sg-core" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.626677 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="ceilometer-central-agent" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.626706 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" containerName="proxy-httpd" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.638745 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:06 crc 
kubenswrapper[4831]: I1203 06:54:06.638869 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.642212 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.642290 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.642555 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.740248 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-config-data\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.740349 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.740375 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-log-httpd\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.740411 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-run-httpd\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.740427 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6p5m\" (UniqueName: \"kubernetes.io/projected/35a9b980-06af-46ee-add5-b3e2ff18aca0-kube-api-access-c6p5m\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.740480 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.740513 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-scripts\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.740564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.842696 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: 
\"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.842784 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-log-httpd\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.842866 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-run-httpd\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.842916 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6p5m\" (UniqueName: \"kubernetes.io/projected/35a9b980-06af-46ee-add5-b3e2ff18aca0-kube-api-access-c6p5m\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.842969 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.843038 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-scripts\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.843154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.843199 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-config-data\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.844023 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-run-httpd\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.856972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-scripts\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.857747 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-log-httpd\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.857981 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.858254 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.858530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.858584 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-config-data\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:06 crc kubenswrapper[4831]: I1203 06:54:06.869718 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6p5m\" (UniqueName: \"kubernetes.io/projected/35a9b980-06af-46ee-add5-b3e2ff18aca0-kube-api-access-c6p5m\") pod \"ceilometer-0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " pod="openstack/ceilometer-0" Dec 03 06:54:07 crc kubenswrapper[4831]: I1203 06:54:07.013550 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:54:07 crc kubenswrapper[4831]: I1203 06:54:07.040058 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bdd7583-b04b-4c20-929b-b69fb3d05aa9" path="/var/lib/kubelet/pods/9bdd7583-b04b-4c20-929b-b69fb3d05aa9/volumes" Dec 03 06:54:07 crc kubenswrapper[4831]: I1203 06:54:07.482037 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:07 crc kubenswrapper[4831]: I1203 06:54:07.490567 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 06:54:08 crc kubenswrapper[4831]: I1203 06:54:08.248797 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerStarted","Data":"d86146c5e3fa8e2a6c2bb68dffb49cc05212aeb0a47ff8dde15fadc6f9d5e997"} Dec 03 06:54:08 crc kubenswrapper[4831]: I1203 06:54:08.560929 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 06:54:08 crc kubenswrapper[4831]: I1203 06:54:08.570994 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 06:54:09 crc kubenswrapper[4831]: I1203 06:54:09.258808 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerStarted","Data":"4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1"} Dec 03 06:54:09 crc kubenswrapper[4831]: I1203 06:54:09.259140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerStarted","Data":"badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45"} Dec 03 06:54:09 crc kubenswrapper[4831]: I1203 06:54:09.605137 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openstack/nova-metadata-0" Dec 03 06:54:09 crc kubenswrapper[4831]: I1203 06:54:09.605393 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 06:54:10 crc kubenswrapper[4831]: I1203 06:54:10.280691 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerStarted","Data":"76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943"} Dec 03 06:54:10 crc kubenswrapper[4831]: I1203 06:54:10.620508 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:54:10 crc kubenswrapper[4831]: I1203 06:54:10.620814 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:54:12 crc kubenswrapper[4831]: I1203 06:54:12.305120 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerStarted","Data":"849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8"} Dec 03 06:54:12 crc kubenswrapper[4831]: I1203 06:54:12.306253 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 06:54:12 crc kubenswrapper[4831]: I1203 06:54:12.334734 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.705711882 podStartE2EDuration="6.334706173s" podCreationTimestamp="2025-12-03 06:54:06 +0000 UTC" 
firstStartedPulling="2025-12-03 06:54:07.490369776 +0000 UTC m=+1384.833953284" lastFinishedPulling="2025-12-03 06:54:11.119364047 +0000 UTC m=+1388.462947575" observedRunningTime="2025-12-03 06:54:12.331936227 +0000 UTC m=+1389.675519745" watchObservedRunningTime="2025-12-03 06:54:12.334706173 +0000 UTC m=+1389.678289731" Dec 03 06:54:12 crc kubenswrapper[4831]: I1203 06:54:12.830261 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 06:54:12 crc kubenswrapper[4831]: I1203 06:54:12.830332 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 06:54:13 crc kubenswrapper[4831]: I1203 06:54:13.560008 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 06:54:13 crc kubenswrapper[4831]: I1203 06:54:13.600229 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Dec 03 06:54:13 crc kubenswrapper[4831]: I1203 06:54:13.601265 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 06:54:13 crc kubenswrapper[4831]: I1203 06:54:13.870577 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 06:54:13 crc kubenswrapper[4831]: I1203 06:54:13.870719 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.194:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 06:54:14 crc kubenswrapper[4831]: I1203 06:54:14.364340 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 06:54:19 crc kubenswrapper[4831]: I1203 06:54:19.613663 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 06:54:19 crc kubenswrapper[4831]: I1203 06:54:19.616339 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 06:54:19 crc kubenswrapper[4831]: I1203 06:54:19.622396 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 06:54:19 crc kubenswrapper[4831]: I1203 06:54:19.622485 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.340929 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.415225 4831 generic.go:334] "Generic (PLEG): container finished" podID="d12bc766-285d-4de7-afca-be32ff19514f" containerID="b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad" exitCode=137 Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.415265 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d12bc766-285d-4de7-afca-be32ff19514f","Type":"ContainerDied","Data":"b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad"} Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.415273 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.415344 4831 scope.go:117] "RemoveContainer" containerID="b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.415329 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d12bc766-285d-4de7-afca-be32ff19514f","Type":"ContainerDied","Data":"436a0cd0f674d25b9d4aaa04bce763fb896ed7051620d55535bc4a497f800817"} Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.442576 4831 scope.go:117] "RemoveContainer" containerID="b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad" Dec 03 06:54:22 crc kubenswrapper[4831]: E1203 06:54:22.443093 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad\": container with ID starting with b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad not found: ID does not exist" containerID="b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.443172 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad"} err="failed to get container status \"b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad\": rpc error: code = NotFound desc = could not find container \"b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad\": container with ID starting with b53ec2242e225d07b36b13977911027d860febc1488fba9a744a48720480f4ad not found: ID does not exist" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.465774 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-combined-ca-bundle\") pod \"d12bc766-285d-4de7-afca-be32ff19514f\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.465918 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-82mmx\" (UniqueName: \"kubernetes.io/projected/d12bc766-285d-4de7-afca-be32ff19514f-kube-api-access-82mmx\") pod \"d12bc766-285d-4de7-afca-be32ff19514f\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.466207 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-config-data\") pod \"d12bc766-285d-4de7-afca-be32ff19514f\" (UID: \"d12bc766-285d-4de7-afca-be32ff19514f\") " Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.471986 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d12bc766-285d-4de7-afca-be32ff19514f-kube-api-access-82mmx" (OuterVolumeSpecName: "kube-api-access-82mmx") pod "d12bc766-285d-4de7-afca-be32ff19514f" (UID: "d12bc766-285d-4de7-afca-be32ff19514f"). InnerVolumeSpecName "kube-api-access-82mmx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.500933 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d12bc766-285d-4de7-afca-be32ff19514f" (UID: "d12bc766-285d-4de7-afca-be32ff19514f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.501727 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-config-data" (OuterVolumeSpecName: "config-data") pod "d12bc766-285d-4de7-afca-be32ff19514f" (UID: "d12bc766-285d-4de7-afca-be32ff19514f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.568995 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.569032 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-82mmx\" (UniqueName: \"kubernetes.io/projected/d12bc766-285d-4de7-afca-be32ff19514f-kube-api-access-82mmx\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.569048 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d12bc766-285d-4de7-afca-be32ff19514f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.765154 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.779411 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.796900 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:54:22 crc kubenswrapper[4831]: E1203 06:54:22.797298 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d12bc766-285d-4de7-afca-be32ff19514f" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 
06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.797378 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d12bc766-285d-4de7-afca-be32ff19514f" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.797614 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d12bc766-285d-4de7-afca-be32ff19514f" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.798199 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.800971 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.802699 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.802964 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.820330 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.848036 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.849248 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.849435 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.855593 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 
06:54:22.873733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.873782 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.873818 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.873907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.873940 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vjs5\" (UniqueName: \"kubernetes.io/projected/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-kube-api-access-2vjs5\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc 
kubenswrapper[4831]: I1203 06:54:22.975235 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.975286 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.975329 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.975360 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.975564 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2vjs5\" (UniqueName: \"kubernetes.io/projected/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-kube-api-access-2vjs5\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.979714 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.981842 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.982125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.982897 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:22 crc kubenswrapper[4831]: I1203 06:54:22.992307 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vjs5\" (UniqueName: \"kubernetes.io/projected/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-kube-api-access-2vjs5\") pod \"nova-cell1-novncproxy-0\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.023592 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d12bc766-285d-4de7-afca-be32ff19514f" 
path="/var/lib/kubelet/pods/d12bc766-285d-4de7-afca-be32ff19514f/volumes" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.114656 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.425430 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.429543 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.593165 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.643755 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-lw9v5"] Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.645569 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.667840 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-lw9v5"] Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.792986 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.793087 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.793150 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztzkg\" (UniqueName: \"kubernetes.io/projected/4f8ca881-226a-4311-aada-636335beea0d-kube-api-access-ztzkg\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.793170 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.793252 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-config\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.793299 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.895008 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-config\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.896337 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-config\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.896426 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.901665 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.902042 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.902198 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztzkg\" (UniqueName: \"kubernetes.io/projected/4f8ca881-226a-4311-aada-636335beea0d-kube-api-access-ztzkg\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.902256 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.907852 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.908288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.908788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.908969 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.925890 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztzkg\" (UniqueName: \"kubernetes.io/projected/4f8ca881-226a-4311-aada-636335beea0d-kube-api-access-ztzkg\") pod \"dnsmasq-dns-cd5cbd7b9-lw9v5\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:23 crc kubenswrapper[4831]: I1203 06:54:23.994791 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:24 crc kubenswrapper[4831]: I1203 06:54:24.436419 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2","Type":"ContainerStarted","Data":"a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7"} Dec 03 06:54:24 crc kubenswrapper[4831]: I1203 06:54:24.436768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2","Type":"ContainerStarted","Data":"27b6a9bc76e14bc76d928be747146d339bd492ea8531c28e69fd478b6d6d0d1d"} Dec 03 06:54:24 crc kubenswrapper[4831]: I1203 06:54:24.457616 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.457600272 podStartE2EDuration="2.457600272s" podCreationTimestamp="2025-12-03 06:54:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:24.450857673 +0000 UTC m=+1401.794441181" watchObservedRunningTime="2025-12-03 06:54:24.457600272 +0000 UTC m=+1401.801183780" Dec 03 06:54:24 crc kubenswrapper[4831]: I1203 06:54:24.509573 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-lw9v5"] Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.333620 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9v7gk"] Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.336474 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.365752 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9v7gk"] Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.387683 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-catalog-content\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.387826 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmpjd\" (UniqueName: \"kubernetes.io/projected/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-kube-api-access-mmpjd\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.387857 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-utilities\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.447464 4831 generic.go:334] "Generic (PLEG): container finished" podID="4f8ca881-226a-4311-aada-636335beea0d" containerID="2d3b92207030050487cb5607a18a0a68b0fe9b7d540ea69d42fbdbfb33812e73" exitCode=0 Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.448722 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" 
event={"ID":"4f8ca881-226a-4311-aada-636335beea0d","Type":"ContainerDied","Data":"2d3b92207030050487cb5607a18a0a68b0fe9b7d540ea69d42fbdbfb33812e73"} Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.448756 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" event={"ID":"4f8ca881-226a-4311-aada-636335beea0d","Type":"ContainerStarted","Data":"4ed9b53c437be18c4fdd3fa4018341e807fe81e32f2a54f0a2ad58fe0b7646d7"} Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.489416 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-catalog-content\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.489650 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmpjd\" (UniqueName: \"kubernetes.io/projected/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-kube-api-access-mmpjd\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.489687 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-utilities\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.489848 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-catalog-content\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " 
pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.490353 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-utilities\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.521666 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mmpjd\" (UniqueName: \"kubernetes.io/projected/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-kube-api-access-mmpjd\") pod \"redhat-operators-9v7gk\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.662647 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.760475 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.765132 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="ceilometer-central-agent" containerID="cri-o://badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45" gracePeriod=30 Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.766585 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="proxy-httpd" containerID="cri-o://849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8" gracePeriod=30 Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.766625 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="sg-core" containerID="cri-o://76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943" gracePeriod=30 Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.766684 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="ceilometer-notification-agent" containerID="cri-o://4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1" gracePeriod=30 Dec 03 06:54:25 crc kubenswrapper[4831]: I1203 06:54:25.773525 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.192280 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9v7gk"] Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.458394 4831 generic.go:334] "Generic (PLEG): container finished" podID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerID="849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8" exitCode=0 Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.458633 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerDied","Data":"849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8"} Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.458712 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerDied","Data":"76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943"} Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.458671 4831 generic.go:334] "Generic (PLEG): container finished" podID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerID="76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943" 
exitCode=2 Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.458745 4831 generic.go:334] "Generic (PLEG): container finished" podID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerID="badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45" exitCode=0 Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.458844 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerDied","Data":"badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45"} Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.461666 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" event={"ID":"4f8ca881-226a-4311-aada-636335beea0d","Type":"ContainerStarted","Data":"77f57d07ea85aecac94834d02ec77c6992bc37074e8198760a59e04a568577ff"} Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.462722 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.464518 4831 generic.go:334] "Generic (PLEG): container finished" podID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerID="806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916" exitCode=0 Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.464544 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9v7gk" event={"ID":"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea","Type":"ContainerDied","Data":"806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916"} Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.464558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9v7gk" event={"ID":"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea","Type":"ContainerStarted","Data":"6a2879ae7e8ba4be1b4dd474ead5064400215a9986e312f8a55c8b3507d6df9f"} Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 
06:54:26.480127 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" podStartSLOduration=3.480109539 podStartE2EDuration="3.480109539s" podCreationTimestamp="2025-12-03 06:54:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:26.478269701 +0000 UTC m=+1403.821853209" watchObservedRunningTime="2025-12-03 06:54:26.480109539 +0000 UTC m=+1403.823693047" Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.895396 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.895892 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-log" containerID="cri-o://d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853" gracePeriod=30 Dec 03 06:54:26 crc kubenswrapper[4831]: I1203 06:54:26.895967 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-api" containerID="cri-o://9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0" gracePeriod=30 Dec 03 06:54:27 crc kubenswrapper[4831]: I1203 06:54:27.475146 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9v7gk" event={"ID":"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea","Type":"ContainerStarted","Data":"506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943"} Dec 03 06:54:27 crc kubenswrapper[4831]: I1203 06:54:27.477824 4831 generic.go:334] "Generic (PLEG): container finished" podID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerID="d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853" exitCode=143 Dec 03 06:54:27 crc kubenswrapper[4831]: I1203 06:54:27.477910 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a81d9172-7404-4091-ba7a-60c5ee3266ea","Type":"ContainerDied","Data":"d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853"} Dec 03 06:54:28 crc kubenswrapper[4831]: I1203 06:54:28.115186 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:28 crc kubenswrapper[4831]: I1203 06:54:28.486488 4831 generic.go:334] "Generic (PLEG): container finished" podID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerID="506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943" exitCode=0 Dec 03 06:54:28 crc kubenswrapper[4831]: I1203 06:54:28.486530 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9v7gk" event={"ID":"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea","Type":"ContainerDied","Data":"506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943"} Dec 03 06:54:29 crc kubenswrapper[4831]: I1203 06:54:29.503738 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9v7gk" event={"ID":"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea","Type":"ContainerStarted","Data":"7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a"} Dec 03 06:54:29 crc kubenswrapper[4831]: I1203 06:54:29.528820 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9v7gk" podStartSLOduration=2.09670945 podStartE2EDuration="4.528797876s" podCreationTimestamp="2025-12-03 06:54:25 +0000 UTC" firstStartedPulling="2025-12-03 06:54:26.465698512 +0000 UTC m=+1403.809282020" lastFinishedPulling="2025-12-03 06:54:28.897786918 +0000 UTC m=+1406.241370446" observedRunningTime="2025-12-03 06:54:29.526637818 +0000 UTC m=+1406.870221366" watchObservedRunningTime="2025-12-03 06:54:29.528797876 +0000 UTC m=+1406.872381394" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.494695 4831 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.503989 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.521939 4831 generic.go:334] "Generic (PLEG): container finished" podID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerID="9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0" exitCode=0 Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.522008 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a81d9172-7404-4091-ba7a-60c5ee3266ea","Type":"ContainerDied","Data":"9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0"} Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.522036 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"a81d9172-7404-4091-ba7a-60c5ee3266ea","Type":"ContainerDied","Data":"59dbf63d9435b73ac76cb1b2f5f437f65810278ae754480f8426ceb89bb775a1"} Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.522052 4831 scope.go:117] "RemoveContainer" containerID="9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.522045 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.529324 4831 generic.go:334] "Generic (PLEG): container finished" podID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerID="4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1" exitCode=0 Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.530255 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.530679 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerDied","Data":"4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1"} Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.530710 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"35a9b980-06af-46ee-add5-b3e2ff18aca0","Type":"ContainerDied","Data":"d86146c5e3fa8e2a6c2bb68dffb49cc05212aeb0a47ff8dde15fadc6f9d5e997"} Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.558330 4831 scope.go:117] "RemoveContainer" containerID="d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.589424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c6p5m\" (UniqueName: \"kubernetes.io/projected/35a9b980-06af-46ee-add5-b3e2ff18aca0-kube-api-access-c6p5m\") pod \"35a9b980-06af-46ee-add5-b3e2ff18aca0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.589521 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-ceilometer-tls-certs\") pod \"35a9b980-06af-46ee-add5-b3e2ff18aca0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.589589 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-config-data\") pod \"35a9b980-06af-46ee-add5-b3e2ff18aca0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.589627 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-scripts\") pod \"35a9b980-06af-46ee-add5-b3e2ff18aca0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.589677 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-run-httpd\") pod \"35a9b980-06af-46ee-add5-b3e2ff18aca0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.589702 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-sg-core-conf-yaml\") pod \"35a9b980-06af-46ee-add5-b3e2ff18aca0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.589765 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-combined-ca-bundle\") pod \"35a9b980-06af-46ee-add5-b3e2ff18aca0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.589815 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-log-httpd\") pod \"35a9b980-06af-46ee-add5-b3e2ff18aca0\" (UID: \"35a9b980-06af-46ee-add5-b3e2ff18aca0\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.590967 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "35a9b980-06af-46ee-add5-b3e2ff18aca0" (UID: "35a9b980-06af-46ee-add5-b3e2ff18aca0"). 
InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.592022 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "35a9b980-06af-46ee-add5-b3e2ff18aca0" (UID: "35a9b980-06af-46ee-add5-b3e2ff18aca0"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.592064 4831 scope.go:117] "RemoveContainer" containerID="9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.594720 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0\": container with ID starting with 9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0 not found: ID does not exist" containerID="9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.594756 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0"} err="failed to get container status \"9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0\": rpc error: code = NotFound desc = could not find container \"9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0\": container with ID starting with 9b22e933a751859f13946b5050ec9635f92a3e23f0b80d2fb8daa55f71d1b4d0 not found: ID does not exist" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.594779 4831 scope.go:117] "RemoveContainer" containerID="d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.595732 4831 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853\": container with ID starting with d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853 not found: ID does not exist" containerID="d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.595788 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853"} err="failed to get container status \"d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853\": rpc error: code = NotFound desc = could not find container \"d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853\": container with ID starting with d787a50fc6d04f02f3aaf986de3ff09f15aa9daf2ccae9f05047a86c5df47853 not found: ID does not exist" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.595819 4831 scope.go:117] "RemoveContainer" containerID="849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.600630 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/35a9b980-06af-46ee-add5-b3e2ff18aca0-kube-api-access-c6p5m" (OuterVolumeSpecName: "kube-api-access-c6p5m") pod "35a9b980-06af-46ee-add5-b3e2ff18aca0" (UID: "35a9b980-06af-46ee-add5-b3e2ff18aca0"). InnerVolumeSpecName "kube-api-access-c6p5m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.600638 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-scripts" (OuterVolumeSpecName: "scripts") pod "35a9b980-06af-46ee-add5-b3e2ff18aca0" (UID: "35a9b980-06af-46ee-add5-b3e2ff18aca0"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.619811 4831 scope.go:117] "RemoveContainer" containerID="76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.631040 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "35a9b980-06af-46ee-add5-b3e2ff18aca0" (UID: "35a9b980-06af-46ee-add5-b3e2ff18aca0"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.648960 4831 scope.go:117] "RemoveContainer" containerID="4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.667428 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "35a9b980-06af-46ee-add5-b3e2ff18aca0" (UID: "35a9b980-06af-46ee-add5-b3e2ff18aca0"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.668977 4831 scope.go:117] "RemoveContainer" containerID="badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.687163 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "35a9b980-06af-46ee-add5-b3e2ff18aca0" (UID: "35a9b980-06af-46ee-add5-b3e2ff18aca0"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.693214 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81d9172-7404-4091-ba7a-60c5ee3266ea-logs\") pod \"a81d9172-7404-4091-ba7a-60c5ee3266ea\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.693281 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-config-data\") pod \"a81d9172-7404-4091-ba7a-60c5ee3266ea\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.693309 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zsq5n\" (UniqueName: \"kubernetes.io/projected/a81d9172-7404-4091-ba7a-60c5ee3266ea-kube-api-access-zsq5n\") pod \"a81d9172-7404-4091-ba7a-60c5ee3266ea\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.693368 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-combined-ca-bundle\") pod \"a81d9172-7404-4091-ba7a-60c5ee3266ea\" (UID: \"a81d9172-7404-4091-ba7a-60c5ee3266ea\") " Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.693739 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81d9172-7404-4091-ba7a-60c5ee3266ea-logs" (OuterVolumeSpecName: "logs") pod "a81d9172-7404-4091-ba7a-60c5ee3266ea" (UID: "a81d9172-7404-4091-ba7a-60c5ee3266ea"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.694095 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c6p5m\" (UniqueName: \"kubernetes.io/projected/35a9b980-06af-46ee-add5-b3e2ff18aca0-kube-api-access-c6p5m\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.694121 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a81d9172-7404-4091-ba7a-60c5ee3266ea-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.694134 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.694146 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.694156 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.694166 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.694178 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.694188 4831 
reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/35a9b980-06af-46ee-add5-b3e2ff18aca0-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.696747 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81d9172-7404-4091-ba7a-60c5ee3266ea-kube-api-access-zsq5n" (OuterVolumeSpecName: "kube-api-access-zsq5n") pod "a81d9172-7404-4091-ba7a-60c5ee3266ea" (UID: "a81d9172-7404-4091-ba7a-60c5ee3266ea"). InnerVolumeSpecName "kube-api-access-zsq5n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.699006 4831 scope.go:117] "RemoveContainer" containerID="849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.699497 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8\": container with ID starting with 849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8 not found: ID does not exist" containerID="849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.699560 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8"} err="failed to get container status \"849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8\": rpc error: code = NotFound desc = could not find container \"849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8\": container with ID starting with 849e6901f93893cee8a7cc8b79e160199c5337d0d579443b541bf1c5790101f8 not found: ID does not exist" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.699593 4831 scope.go:117] "RemoveContainer" 
containerID="76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.699867 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943\": container with ID starting with 76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943 not found: ID does not exist" containerID="76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.699893 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943"} err="failed to get container status \"76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943\": rpc error: code = NotFound desc = could not find container \"76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943\": container with ID starting with 76a4c8f780ac4e725b7b137c558958b6c61cce39262bb1ded2729f800bf5e943 not found: ID does not exist" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.699910 4831 scope.go:117] "RemoveContainer" containerID="4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.700095 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1\": container with ID starting with 4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1 not found: ID does not exist" containerID="4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.700122 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1"} err="failed to get container status \"4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1\": rpc error: code = NotFound desc = could not find container \"4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1\": container with ID starting with 4e1d1159b04fcbc9fd74a20ebe473024e605494c84d329938f96907a3174f8a1 not found: ID does not exist" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.700138 4831 scope.go:117] "RemoveContainer" containerID="badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.700340 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45\": container with ID starting with badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45 not found: ID does not exist" containerID="badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.700366 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45"} err="failed to get container status \"badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45\": rpc error: code = NotFound desc = could not find container \"badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45\": container with ID starting with badd65421f191f150c744558e73edc35a03ff452cb27f555bedfcf599cff9f45 not found: ID does not exist" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.722403 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-config-data" (OuterVolumeSpecName: "config-data") pod "35a9b980-06af-46ee-add5-b3e2ff18aca0" (UID: 
"35a9b980-06af-46ee-add5-b3e2ff18aca0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.722959 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a81d9172-7404-4091-ba7a-60c5ee3266ea" (UID: "a81d9172-7404-4091-ba7a-60c5ee3266ea"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.728195 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-config-data" (OuterVolumeSpecName: "config-data") pod "a81d9172-7404-4091-ba7a-60c5ee3266ea" (UID: "a81d9172-7404-4091-ba7a-60c5ee3266ea"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.795978 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/35a9b980-06af-46ee-add5-b3e2ff18aca0-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.796024 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.796041 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zsq5n\" (UniqueName: \"kubernetes.io/projected/a81d9172-7404-4091-ba7a-60c5ee3266ea-kube-api-access-zsq5n\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.796055 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a81d9172-7404-4091-ba7a-60c5ee3266ea-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.853561 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.865301 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.877226 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.887354 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.898523 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.898963 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="proxy-httpd" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.898987 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="proxy-httpd" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.899006 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-log" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899012 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-log" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.899026 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="ceilometer-notification-agent" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899032 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="ceilometer-notification-agent" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.899047 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="sg-core" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899053 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="sg-core" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.899067 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="ceilometer-central-agent" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899073 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="ceilometer-central-agent" Dec 03 06:54:30 crc kubenswrapper[4831]: E1203 06:54:30.899086 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-api" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899092 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-api" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899280 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="proxy-httpd" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899291 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="ceilometer-central-agent" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899304 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="ceilometer-notification-agent" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899344 4831 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-api" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899356 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" containerName="sg-core" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.899369 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" containerName="nova-api-log" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.900341 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.906253 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.906293 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.906253 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.911760 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.914668 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.920651 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.928270 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.928625 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.928771 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 06:54:30 crc kubenswrapper[4831]: I1203 06:54:30.936911 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.001420 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b898bb8-339a-41bc-a19b-a41195e16d8f-logs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.001513 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.001575 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-config-data\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 
06:54:31.001680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.001704 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzpxp\" (UniqueName: \"kubernetes.io/projected/6b898bb8-339a-41bc-a19b-a41195e16d8f-kube-api-access-wzpxp\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.001731 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.052696 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="35a9b980-06af-46ee-add5-b3e2ff18aca0" path="/var/lib/kubelet/pods/35a9b980-06af-46ee-add5-b3e2ff18aca0/volumes" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.053664 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81d9172-7404-4091-ba7a-60c5ee3266ea" path="/var/lib/kubelet/pods/a81d9172-7404-4091-ba7a-60c5ee3266ea/volumes" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103054 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 
06:54:31.103088 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzpxp\" (UniqueName: \"kubernetes.io/projected/6b898bb8-339a-41bc-a19b-a41195e16d8f-kube-api-access-wzpxp\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103145 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b898bb8-339a-41bc-a19b-a41195e16d8f-logs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103174 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-log-httpd\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103236 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103260 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-scripts\") pod 
\"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103289 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103334 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103354 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-config-data\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103369 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-config-data\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103387 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6qqp\" (UniqueName: \"kubernetes.io/projected/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-kube-api-access-f6qqp\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: 
I1203 06:54:31.103413 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-run-httpd\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103458 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.103686 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b898bb8-339a-41bc-a19b-a41195e16d8f-logs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.110290 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-internal-tls-certs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.110668 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-public-tls-certs\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.113065 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.122488 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-config-data\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.129525 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzpxp\" (UniqueName: \"kubernetes.io/projected/6b898bb8-339a-41bc-a19b-a41195e16d8f-kube-api-access-wzpxp\") pod \"nova-api-0\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.204995 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.205469 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-log-httpd\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.205535 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-scripts\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.205578 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.205616 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.205642 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-config-data\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.205670 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6qqp\" (UniqueName: \"kubernetes.io/projected/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-kube-api-access-f6qqp\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.205704 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-run-httpd\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.206071 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-log-httpd\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 
crc kubenswrapper[4831]: I1203 06:54:31.206376 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-run-httpd\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.208647 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.209201 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.209731 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-scripts\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.211753 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-config-data\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.212727 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.223724 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6qqp\" (UniqueName: \"kubernetes.io/projected/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-kube-api-access-f6qqp\") pod \"ceilometer-0\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.234697 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.242859 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.759232 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:31 crc kubenswrapper[4831]: W1203 06:54:31.769137 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b898bb8_339a_41bc_a19b_a41195e16d8f.slice/crio-e47aff715904c0248c6c38dfbdb0eff0784453465b483749c833c61aeb353646 WatchSource:0}: Error finding container e47aff715904c0248c6c38dfbdb0eff0784453465b483749c833c61aeb353646: Status 404 returned error can't find the container with id e47aff715904c0248c6c38dfbdb0eff0784453465b483749c833c61aeb353646 Dec 03 06:54:31 crc kubenswrapper[4831]: I1203 06:54:31.847234 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:54:32 crc kubenswrapper[4831]: I1203 06:54:32.572996 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerStarted","Data":"7103e91ff04023ddb4c24a272266745e9e189b6b2220b5897dd2cdc470bfc0b1"} Dec 03 06:54:32 crc kubenswrapper[4831]: I1203 06:54:32.575272 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b898bb8-339a-41bc-a19b-a41195e16d8f","Type":"ContainerStarted","Data":"0683fcacd44a2588366c5fcd6cc3611fb9f316d87e1f2dde6d14ccb17889efd5"} Dec 03 06:54:32 crc kubenswrapper[4831]: I1203 06:54:32.575294 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b898bb8-339a-41bc-a19b-a41195e16d8f","Type":"ContainerStarted","Data":"494bdbb04a97a9462642db6ed2eac92b40a565c9874d9c4019b5695ebc6ac105"} Dec 03 06:54:32 crc kubenswrapper[4831]: I1203 06:54:32.575304 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b898bb8-339a-41bc-a19b-a41195e16d8f","Type":"ContainerStarted","Data":"e47aff715904c0248c6c38dfbdb0eff0784453465b483749c833c61aeb353646"} Dec 03 06:54:32 crc kubenswrapper[4831]: I1203 06:54:32.601653 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.60162199 podStartE2EDuration="2.60162199s" podCreationTimestamp="2025-12-03 06:54:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:32.593900621 +0000 UTC m=+1409.937484129" watchObservedRunningTime="2025-12-03 06:54:32.60162199 +0000 UTC m=+1409.945205498" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.115185 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.136641 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.594181 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerStarted","Data":"7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7"} 
Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.594239 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerStarted","Data":"fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d"} Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.615099 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.764150 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-jqkhn"] Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.765296 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.771860 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.772337 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.786394 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jqkhn"] Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.880949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-config-data\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.881013 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.881114 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-scripts\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.881466 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfjz8\" (UniqueName: \"kubernetes.io/projected/aa6da26d-604c-42a4-8c46-c3437894a4ae-kube-api-access-jfjz8\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.983710 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-config-data\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.983808 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.983848 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-scripts\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.983959 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfjz8\" (UniqueName: \"kubernetes.io/projected/aa6da26d-604c-42a4-8c46-c3437894a4ae-kube-api-access-jfjz8\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.992948 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-scripts\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.993645 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.996907 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-config-data\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:33 crc kubenswrapper[4831]: I1203 06:54:33.997179 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:54:34 crc kubenswrapper[4831]: I1203 06:54:34.003428 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfjz8\" (UniqueName: \"kubernetes.io/projected/aa6da26d-604c-42a4-8c46-c3437894a4ae-kube-api-access-jfjz8\") pod \"nova-cell1-cell-mapping-jqkhn\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:34 crc kubenswrapper[4831]: I1203 06:54:34.106908 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:34 crc kubenswrapper[4831]: I1203 06:54:34.124767 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-97qsj"] Dec 03 06:54:34 crc kubenswrapper[4831]: I1203 06:54:34.125167 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" podUID="d0a762a2-e53e-4c92-9dee-600387fa5444" containerName="dnsmasq-dns" containerID="cri-o://7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac" gracePeriod=10 Dec 03 06:54:34 crc kubenswrapper[4831]: I1203 06:54:34.857210 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-jqkhn"] Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.258705 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.418823 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-sb\") pod \"d0a762a2-e53e-4c92-9dee-600387fa5444\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.418880 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-nb\") pod \"d0a762a2-e53e-4c92-9dee-600387fa5444\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.418908 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9ztp\" (UniqueName: \"kubernetes.io/projected/d0a762a2-e53e-4c92-9dee-600387fa5444-kube-api-access-h9ztp\") pod \"d0a762a2-e53e-4c92-9dee-600387fa5444\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.419081 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-config\") pod \"d0a762a2-e53e-4c92-9dee-600387fa5444\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.419155 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-svc\") pod \"d0a762a2-e53e-4c92-9dee-600387fa5444\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.419210 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-swift-storage-0\") pod \"d0a762a2-e53e-4c92-9dee-600387fa5444\" (UID: \"d0a762a2-e53e-4c92-9dee-600387fa5444\") " Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.425212 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0a762a2-e53e-4c92-9dee-600387fa5444-kube-api-access-h9ztp" (OuterVolumeSpecName: "kube-api-access-h9ztp") pod "d0a762a2-e53e-4c92-9dee-600387fa5444" (UID: "d0a762a2-e53e-4c92-9dee-600387fa5444"). InnerVolumeSpecName "kube-api-access-h9ztp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.475237 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "d0a762a2-e53e-4c92-9dee-600387fa5444" (UID: "d0a762a2-e53e-4c92-9dee-600387fa5444"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.482106 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d0a762a2-e53e-4c92-9dee-600387fa5444" (UID: "d0a762a2-e53e-4c92-9dee-600387fa5444"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.486456 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d0a762a2-e53e-4c92-9dee-600387fa5444" (UID: "d0a762a2-e53e-4c92-9dee-600387fa5444"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.489891 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-config" (OuterVolumeSpecName: "config") pod "d0a762a2-e53e-4c92-9dee-600387fa5444" (UID: "d0a762a2-e53e-4c92-9dee-600387fa5444"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.506486 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d0a762a2-e53e-4c92-9dee-600387fa5444" (UID: "d0a762a2-e53e-4c92-9dee-600387fa5444"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.521136 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.521177 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.521194 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.521206 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:35 
crc kubenswrapper[4831]: I1203 06:54:35.521216 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9ztp\" (UniqueName: \"kubernetes.io/projected/d0a762a2-e53e-4c92-9dee-600387fa5444-kube-api-access-h9ztp\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.521226 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0a762a2-e53e-4c92-9dee-600387fa5444-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.616280 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerStarted","Data":"e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4"} Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.617909 4831 generic.go:334] "Generic (PLEG): container finished" podID="d0a762a2-e53e-4c92-9dee-600387fa5444" containerID="7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac" exitCode=0 Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.617980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" event={"ID":"d0a762a2-e53e-4c92-9dee-600387fa5444","Type":"ContainerDied","Data":"7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac"} Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.618011 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" event={"ID":"d0a762a2-e53e-4c92-9dee-600387fa5444","Type":"ContainerDied","Data":"c19cc2a3c853fa8a9e426cd70a7d51ae8d585ad36871dea8c7519b659efc8514"} Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.618028 4831 scope.go:117] "RemoveContainer" containerID="7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.618190 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-97qsj" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.620016 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jqkhn" event={"ID":"aa6da26d-604c-42a4-8c46-c3437894a4ae","Type":"ContainerStarted","Data":"90079cc1fa171a7d099cbc2a89a597829b7d80cd9239ad5c381d51f605510206"} Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.620062 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jqkhn" event={"ID":"aa6da26d-604c-42a4-8c46-c3437894a4ae","Type":"ContainerStarted","Data":"3192c975039f32876c19a9d68ae123f84dd5479d894fd3a1e16d054bdda2a3f4"} Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.642608 4831 scope.go:117] "RemoveContainer" containerID="cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.646588 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-jqkhn" podStartSLOduration=2.64657264 podStartE2EDuration="2.64657264s" podCreationTimestamp="2025-12-03 06:54:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:35.644334801 +0000 UTC m=+1412.987918309" watchObservedRunningTime="2025-12-03 06:54:35.64657264 +0000 UTC m=+1412.990156148" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.662851 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.662890 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.671814 4831 scope.go:117] "RemoveContainer" containerID="7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac" 
Dec 03 06:54:35 crc kubenswrapper[4831]: E1203 06:54:35.672649 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac\": container with ID starting with 7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac not found: ID does not exist" containerID="7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.672680 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac"} err="failed to get container status \"7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac\": rpc error: code = NotFound desc = could not find container \"7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac\": container with ID starting with 7201edb891fb06ccebd0eb3a43372cdfac77a423ee5b65fa409e7b7a58e2a3ac not found: ID does not exist" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.672700 4831 scope.go:117] "RemoveContainer" containerID="cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3" Dec 03 06:54:35 crc kubenswrapper[4831]: E1203 06:54:35.672925 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3\": container with ID starting with cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3 not found: ID does not exist" containerID="cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.672948 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3"} err="failed to get container status 
\"cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3\": rpc error: code = NotFound desc = could not find container \"cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3\": container with ID starting with cfec50ae3b416523c16951bccc3e0858691aa3190b42701d46b9e5ce5f8845c3 not found: ID does not exist" Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.674109 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-97qsj"] Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.685208 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-97qsj"] Dec 03 06:54:35 crc kubenswrapper[4831]: I1203 06:54:35.716510 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:36 crc kubenswrapper[4831]: I1203 06:54:36.632258 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerStarted","Data":"261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8"} Dec 03 06:54:36 crc kubenswrapper[4831]: I1203 06:54:36.632643 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 06:54:36 crc kubenswrapper[4831]: I1203 06:54:36.684195 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:36 crc kubenswrapper[4831]: I1203 06:54:36.710918 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.45809786 podStartE2EDuration="6.710900695s" podCreationTimestamp="2025-12-03 06:54:30 +0000 UTC" firstStartedPulling="2025-12-03 06:54:31.849061394 +0000 UTC m=+1409.192644902" lastFinishedPulling="2025-12-03 06:54:36.101864229 +0000 UTC m=+1413.445447737" observedRunningTime="2025-12-03 06:54:36.659833451 +0000 UTC 
m=+1414.003416969" watchObservedRunningTime="2025-12-03 06:54:36.710900695 +0000 UTC m=+1414.054484203" Dec 03 06:54:36 crc kubenswrapper[4831]: I1203 06:54:36.745441 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9v7gk"] Dec 03 06:54:37 crc kubenswrapper[4831]: I1203 06:54:37.028538 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0a762a2-e53e-4c92-9dee-600387fa5444" path="/var/lib/kubelet/pods/d0a762a2-e53e-4c92-9dee-600387fa5444/volumes" Dec 03 06:54:38 crc kubenswrapper[4831]: I1203 06:54:38.651141 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9v7gk" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerName="registry-server" containerID="cri-o://7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a" gracePeriod=2 Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.192656 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.303454 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-catalog-content\") pod \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.303585 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-utilities\") pod \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.303629 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmpjd\" (UniqueName: \"kubernetes.io/projected/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-kube-api-access-mmpjd\") pod \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\" (UID: \"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea\") " Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.304844 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-utilities" (OuterVolumeSpecName: "utilities") pod "90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" (UID: "90e5f8ce-b2fa-4ba5-83d4-a289cd516cea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.309948 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-kube-api-access-mmpjd" (OuterVolumeSpecName: "kube-api-access-mmpjd") pod "90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" (UID: "90e5f8ce-b2fa-4ba5-83d4-a289cd516cea"). InnerVolumeSpecName "kube-api-access-mmpjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.398626 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" (UID: "90e5f8ce-b2fa-4ba5-83d4-a289cd516cea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.406020 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.406058 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmpjd\" (UniqueName: \"kubernetes.io/projected/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-kube-api-access-mmpjd\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.406072 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.671698 4831 generic.go:334] "Generic (PLEG): container finished" podID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerID="7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a" exitCode=0 Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.671761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9v7gk" event={"ID":"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea","Type":"ContainerDied","Data":"7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a"} Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.671817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-operators-9v7gk" event={"ID":"90e5f8ce-b2fa-4ba5-83d4-a289cd516cea","Type":"ContainerDied","Data":"6a2879ae7e8ba4be1b4dd474ead5064400215a9986e312f8a55c8b3507d6df9f"} Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.671826 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9v7gk" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.671839 4831 scope.go:117] "RemoveContainer" containerID="7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.704117 4831 scope.go:117] "RemoveContainer" containerID="506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.735045 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9v7gk"] Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.752008 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9v7gk"] Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.763926 4831 scope.go:117] "RemoveContainer" containerID="806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.804602 4831 scope.go:117] "RemoveContainer" containerID="7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a" Dec 03 06:54:39 crc kubenswrapper[4831]: E1203 06:54:39.804992 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a\": container with ID starting with 7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a not found: ID does not exist" containerID="7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.805039 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a"} err="failed to get container status \"7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a\": rpc error: code = NotFound desc = could not find container \"7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a\": container with ID starting with 7dfcc269a310235e13605c2d5d6c940eaa1aaf3064b500114b1e8a1bcf577f0a not found: ID does not exist" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.805064 4831 scope.go:117] "RemoveContainer" containerID="506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943" Dec 03 06:54:39 crc kubenswrapper[4831]: E1203 06:54:39.805391 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943\": container with ID starting with 506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943 not found: ID does not exist" containerID="506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.805432 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943"} err="failed to get container status \"506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943\": rpc error: code = NotFound desc = could not find container \"506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943\": container with ID starting with 506fae631e001da54d46c6e76df86aaba7c5dc58984f0944a9425d7134980943 not found: ID does not exist" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.805456 4831 scope.go:117] "RemoveContainer" containerID="806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916" Dec 03 06:54:39 crc kubenswrapper[4831]: E1203 
06:54:39.805717 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916\": container with ID starting with 806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916 not found: ID does not exist" containerID="806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916" Dec 03 06:54:39 crc kubenswrapper[4831]: I1203 06:54:39.805742 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916"} err="failed to get container status \"806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916\": rpc error: code = NotFound desc = could not find container \"806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916\": container with ID starting with 806854c35630f855949158352173cafd9f05ff45c418095dd9cd00e100ac7916 not found: ID does not exist" Dec 03 06:54:40 crc kubenswrapper[4831]: I1203 06:54:40.687828 4831 generic.go:334] "Generic (PLEG): container finished" podID="aa6da26d-604c-42a4-8c46-c3437894a4ae" containerID="90079cc1fa171a7d099cbc2a89a597829b7d80cd9239ad5c381d51f605510206" exitCode=0 Dec 03 06:54:40 crc kubenswrapper[4831]: I1203 06:54:40.688278 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jqkhn" event={"ID":"aa6da26d-604c-42a4-8c46-c3437894a4ae","Type":"ContainerDied","Data":"90079cc1fa171a7d099cbc2a89a597829b7d80cd9239ad5c381d51f605510206"} Dec 03 06:54:41 crc kubenswrapper[4831]: I1203 06:54:41.032668 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" path="/var/lib/kubelet/pods/90e5f8ce-b2fa-4ba5-83d4-a289cd516cea/volumes" Dec 03 06:54:41 crc kubenswrapper[4831]: I1203 06:54:41.235708 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 
03 06:54:41 crc kubenswrapper[4831]: I1203 06:54:41.235751 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.145992 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.251476 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.251867 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.201:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.271605 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-combined-ca-bundle\") pod \"aa6da26d-604c-42a4-8c46-c3437894a4ae\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.271994 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-scripts\") pod \"aa6da26d-604c-42a4-8c46-c3437894a4ae\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.272210 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfjz8\" (UniqueName: 
\"kubernetes.io/projected/aa6da26d-604c-42a4-8c46-c3437894a4ae-kube-api-access-jfjz8\") pod \"aa6da26d-604c-42a4-8c46-c3437894a4ae\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.272990 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-config-data\") pod \"aa6da26d-604c-42a4-8c46-c3437894a4ae\" (UID: \"aa6da26d-604c-42a4-8c46-c3437894a4ae\") " Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.277938 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-scripts" (OuterVolumeSpecName: "scripts") pod "aa6da26d-604c-42a4-8c46-c3437894a4ae" (UID: "aa6da26d-604c-42a4-8c46-c3437894a4ae"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.294405 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa6da26d-604c-42a4-8c46-c3437894a4ae-kube-api-access-jfjz8" (OuterVolumeSpecName: "kube-api-access-jfjz8") pod "aa6da26d-604c-42a4-8c46-c3437894a4ae" (UID: "aa6da26d-604c-42a4-8c46-c3437894a4ae"). InnerVolumeSpecName "kube-api-access-jfjz8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.303507 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa6da26d-604c-42a4-8c46-c3437894a4ae" (UID: "aa6da26d-604c-42a4-8c46-c3437894a4ae"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.310688 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-config-data" (OuterVolumeSpecName: "config-data") pod "aa6da26d-604c-42a4-8c46-c3437894a4ae" (UID: "aa6da26d-604c-42a4-8c46-c3437894a4ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.375807 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.375847 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.375857 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfjz8\" (UniqueName: \"kubernetes.io/projected/aa6da26d-604c-42a4-8c46-c3437894a4ae-kube-api-access-jfjz8\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.375870 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa6da26d-604c-42a4-8c46-c3437894a4ae-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.722462 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-jqkhn" event={"ID":"aa6da26d-604c-42a4-8c46-c3437894a4ae","Type":"ContainerDied","Data":"3192c975039f32876c19a9d68ae123f84dd5479d894fd3a1e16d054bdda2a3f4"} Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.722508 4831 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="3192c975039f32876c19a9d68ae123f84dd5479d894fd3a1e16d054bdda2a3f4" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.722588 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-jqkhn" Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.922732 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.923008 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-log" containerID="cri-o://494bdbb04a97a9462642db6ed2eac92b40a565c9874d9c4019b5695ebc6ac105" gracePeriod=30 Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.923108 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-api" containerID="cri-o://0683fcacd44a2588366c5fcd6cc3611fb9f316d87e1f2dde6d14ccb17889efd5" gracePeriod=30 Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.970255 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.971095 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-metadata" containerID="cri-o://13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863" gracePeriod=30 Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.971331 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-log" containerID="cri-o://c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03" gracePeriod=30 Dec 03 06:54:42 crc 
kubenswrapper[4831]: I1203 06:54:42.990072 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:42 crc kubenswrapper[4831]: I1203 06:54:42.990336 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" containerName="nova-scheduler-scheduler" containerID="cri-o://135ca5a6d135fc68371a549db1898457ff97cd0c1a268d01721f62961e990264" gracePeriod=30 Dec 03 06:54:43 crc kubenswrapper[4831]: E1203 06:54:43.561752 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="135ca5a6d135fc68371a549db1898457ff97cd0c1a268d01721f62961e990264" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 06:54:43 crc kubenswrapper[4831]: E1203 06:54:43.563993 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="135ca5a6d135fc68371a549db1898457ff97cd0c1a268d01721f62961e990264" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 06:54:43 crc kubenswrapper[4831]: E1203 06:54:43.566613 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="135ca5a6d135fc68371a549db1898457ff97cd0c1a268d01721f62961e990264" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 06:54:43 crc kubenswrapper[4831]: E1203 06:54:43.566676 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" 
podUID="872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" containerName="nova-scheduler-scheduler" Dec 03 06:54:43 crc kubenswrapper[4831]: I1203 06:54:43.734042 4831 generic.go:334] "Generic (PLEG): container finished" podID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerID="c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03" exitCode=143 Dec 03 06:54:43 crc kubenswrapper[4831]: I1203 06:54:43.734130 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9eb2fbe-db81-4563-82c4-b20e364a64e2","Type":"ContainerDied","Data":"c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03"} Dec 03 06:54:43 crc kubenswrapper[4831]: I1203 06:54:43.735849 4831 generic.go:334] "Generic (PLEG): container finished" podID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerID="494bdbb04a97a9462642db6ed2eac92b40a565c9874d9c4019b5695ebc6ac105" exitCode=143 Dec 03 06:54:43 crc kubenswrapper[4831]: I1203 06:54:43.735889 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b898bb8-339a-41bc-a19b-a41195e16d8f","Type":"ContainerDied","Data":"494bdbb04a97a9462642db6ed2eac92b40a565c9874d9c4019b5695ebc6ac105"} Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.115222 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:38974->10.217.0.193:8775: read: connection reset by peer" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.115271 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.193:8775/\": read tcp 10.217.0.2:38972->10.217.0.193:8775: read: connection reset by peer" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 
06:54:46.573644 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.664775 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9eb2fbe-db81-4563-82c4-b20e364a64e2-logs\") pod \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.664905 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-nova-metadata-tls-certs\") pod \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.664936 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-config-data\") pod \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.664973 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-combined-ca-bundle\") pod \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.665020 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-677w7\" (UniqueName: \"kubernetes.io/projected/b9eb2fbe-db81-4563-82c4-b20e364a64e2-kube-api-access-677w7\") pod \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\" (UID: \"b9eb2fbe-db81-4563-82c4-b20e364a64e2\") " Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.666442 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9eb2fbe-db81-4563-82c4-b20e364a64e2-logs" (OuterVolumeSpecName: "logs") pod "b9eb2fbe-db81-4563-82c4-b20e364a64e2" (UID: "b9eb2fbe-db81-4563-82c4-b20e364a64e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.671037 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9eb2fbe-db81-4563-82c4-b20e364a64e2-kube-api-access-677w7" (OuterVolumeSpecName: "kube-api-access-677w7") pod "b9eb2fbe-db81-4563-82c4-b20e364a64e2" (UID: "b9eb2fbe-db81-4563-82c4-b20e364a64e2"). InnerVolumeSpecName "kube-api-access-677w7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.698561 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b9eb2fbe-db81-4563-82c4-b20e364a64e2" (UID: "b9eb2fbe-db81-4563-82c4-b20e364a64e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.727067 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-config-data" (OuterVolumeSpecName: "config-data") pod "b9eb2fbe-db81-4563-82c4-b20e364a64e2" (UID: "b9eb2fbe-db81-4563-82c4-b20e364a64e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.767262 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.767303 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-677w7\" (UniqueName: \"kubernetes.io/projected/b9eb2fbe-db81-4563-82c4-b20e364a64e2-kube-api-access-677w7\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.767335 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9eb2fbe-db81-4563-82c4-b20e364a64e2-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.767350 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.768711 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "b9eb2fbe-db81-4563-82c4-b20e364a64e2" (UID: "b9eb2fbe-db81-4563-82c4-b20e364a64e2"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.776369 4831 generic.go:334] "Generic (PLEG): container finished" podID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerID="13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863" exitCode=0 Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.776436 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9eb2fbe-db81-4563-82c4-b20e364a64e2","Type":"ContainerDied","Data":"13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863"} Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.776529 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"b9eb2fbe-db81-4563-82c4-b20e364a64e2","Type":"ContainerDied","Data":"49d6e71b52757a5ca8d446de334ba984d88d2a09292421a8c4768f053659fb3c"} Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.776558 4831 scope.go:117] "RemoveContainer" containerID="13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.776478 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.800605 4831 scope.go:117] "RemoveContainer" containerID="c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.813603 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.825854 4831 scope.go:117] "RemoveContainer" containerID="13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863" Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 06:54:46.826302 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863\": container with ID starting with 13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863 not found: ID does not exist" containerID="13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.826372 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863"} err="failed to get container status \"13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863\": rpc error: code = NotFound desc = could not find container \"13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863\": container with ID starting with 13ce7ea095470db4e684606e9687e7f6edd2203fc808ef459ac3baa5d6d59863 not found: ID does not exist" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.826399 4831 scope.go:117] "RemoveContainer" containerID="c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03" Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 06:54:46.826760 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03\": container with ID starting with c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03 not found: ID does not exist" containerID="c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.826784 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03"} err="failed to get container status \"c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03\": rpc error: code = NotFound desc = could not find container \"c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03\": container with ID starting with c25727d41e7f65e645b2f8e57ec6dfb34ca6f5c24c8086394bb173525d95df03 not found: ID does not exist" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.835963 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.849571 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 06:54:46.850130 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerName="registry-server" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850155 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerName="registry-server" Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 06:54:46.850167 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerName="extract-utilities" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850177 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerName="extract-utilities" Dec 03 06:54:46 crc 
kubenswrapper[4831]: E1203 06:54:46.850193 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-metadata" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850201 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-metadata" Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 06:54:46.850219 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa6da26d-604c-42a4-8c46-c3437894a4ae" containerName="nova-manage" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850226 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa6da26d-604c-42a4-8c46-c3437894a4ae" containerName="nova-manage" Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 06:54:46.850243 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-log" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850250 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-log" Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 06:54:46.850263 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a762a2-e53e-4c92-9dee-600387fa5444" containerName="init" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850270 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a762a2-e53e-4c92-9dee-600387fa5444" containerName="init" Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 06:54:46.850288 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerName="extract-content" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850295 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerName="extract-content" Dec 03 06:54:46 crc kubenswrapper[4831]: E1203 
06:54:46.850303 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0a762a2-e53e-4c92-9dee-600387fa5444" containerName="dnsmasq-dns" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850336 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0a762a2-e53e-4c92-9dee-600387fa5444" containerName="dnsmasq-dns" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850551 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-metadata" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850568 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="90e5f8ce-b2fa-4ba5-83d4-a289cd516cea" containerName="registry-server" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850592 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0a762a2-e53e-4c92-9dee-600387fa5444" containerName="dnsmasq-dns" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850606 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" containerName="nova-metadata-log" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.850618 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa6da26d-604c-42a4-8c46-c3437894a4ae" containerName="nova-manage" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.851731 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.854842 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.855079 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.863412 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.870670 4831 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/b9eb2fbe-db81-4563-82c4-b20e364a64e2-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.972279 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.972443 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-config-data\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.972468 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 
03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.972539 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnzss\" (UniqueName: \"kubernetes.io/projected/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-kube-api-access-gnzss\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:46 crc kubenswrapper[4831]: I1203 06:54:46.972653 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-logs\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.023919 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9eb2fbe-db81-4563-82c4-b20e364a64e2" path="/var/lib/kubelet/pods/b9eb2fbe-db81-4563-82c4-b20e364a64e2/volumes" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.074371 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.074490 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-config-data\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.074517 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-nova-metadata-tls-certs\") 
pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.074589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnzss\" (UniqueName: \"kubernetes.io/projected/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-kube-api-access-gnzss\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.074651 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-logs\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.075948 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-logs\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.078731 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.079269 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-config-data\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.083076 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.103158 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnzss\" (UniqueName: \"kubernetes.io/projected/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-kube-api-access-gnzss\") pod \"nova-metadata-0\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.168447 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.621029 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.788696 4831 generic.go:334] "Generic (PLEG): container finished" podID="872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" containerID="135ca5a6d135fc68371a549db1898457ff97cd0c1a268d01721f62961e990264" exitCode=0 Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.788902 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c","Type":"ContainerDied","Data":"135ca5a6d135fc68371a549db1898457ff97cd0c1a268d01721f62961e990264"} Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.790265 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586","Type":"ContainerStarted","Data":"3adbb64821e5a6d0a9e9d25bb799ddb3dcbe390f239f4223862330f591416d59"} Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.793330 4831 generic.go:334] "Generic (PLEG): container finished" podID="6b898bb8-339a-41bc-a19b-a41195e16d8f" 
containerID="0683fcacd44a2588366c5fcd6cc3611fb9f316d87e1f2dde6d14ccb17889efd5" exitCode=0 Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.793375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b898bb8-339a-41bc-a19b-a41195e16d8f","Type":"ContainerDied","Data":"0683fcacd44a2588366c5fcd6cc3611fb9f316d87e1f2dde6d14ccb17889efd5"} Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.793400 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6b898bb8-339a-41bc-a19b-a41195e16d8f","Type":"ContainerDied","Data":"e47aff715904c0248c6c38dfbdb0eff0784453465b483749c833c61aeb353646"} Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.793410 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e47aff715904c0248c6c38dfbdb0eff0784453465b483749c833c61aeb353646" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.796055 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.845154 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.889692 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-config-data\") pod \"6b898bb8-339a-41bc-a19b-a41195e16d8f\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.889827 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6jkd\" (UniqueName: \"kubernetes.io/projected/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-kube-api-access-d6jkd\") pod \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.889932 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-config-data\") pod \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.889976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b898bb8-339a-41bc-a19b-a41195e16d8f-logs\") pod \"6b898bb8-339a-41bc-a19b-a41195e16d8f\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.890049 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-public-tls-certs\") pod \"6b898bb8-339a-41bc-a19b-a41195e16d8f\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.890071 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzpxp\" (UniqueName: 
\"kubernetes.io/projected/6b898bb8-339a-41bc-a19b-a41195e16d8f-kube-api-access-wzpxp\") pod \"6b898bb8-339a-41bc-a19b-a41195e16d8f\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.890156 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-combined-ca-bundle\") pod \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\" (UID: \"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.890190 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-combined-ca-bundle\") pod \"6b898bb8-339a-41bc-a19b-a41195e16d8f\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.890792 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-internal-tls-certs\") pod \"6b898bb8-339a-41bc-a19b-a41195e16d8f\" (UID: \"6b898bb8-339a-41bc-a19b-a41195e16d8f\") " Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.890958 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b898bb8-339a-41bc-a19b-a41195e16d8f-logs" (OuterVolumeSpecName: "logs") pod "6b898bb8-339a-41bc-a19b-a41195e16d8f" (UID: "6b898bb8-339a-41bc-a19b-a41195e16d8f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.891735 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b898bb8-339a-41bc-a19b-a41195e16d8f-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.895160 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b898bb8-339a-41bc-a19b-a41195e16d8f-kube-api-access-wzpxp" (OuterVolumeSpecName: "kube-api-access-wzpxp") pod "6b898bb8-339a-41bc-a19b-a41195e16d8f" (UID: "6b898bb8-339a-41bc-a19b-a41195e16d8f"). InnerVolumeSpecName "kube-api-access-wzpxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.898976 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-kube-api-access-d6jkd" (OuterVolumeSpecName: "kube-api-access-d6jkd") pod "872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" (UID: "872f90d4-ad5a-4d6b-a529-4fc5f5fea83c"). InnerVolumeSpecName "kube-api-access-d6jkd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.920833 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6b898bb8-339a-41bc-a19b-a41195e16d8f" (UID: "6b898bb8-339a-41bc-a19b-a41195e16d8f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.926199 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-config-data" (OuterVolumeSpecName: "config-data") pod "6b898bb8-339a-41bc-a19b-a41195e16d8f" (UID: "6b898bb8-339a-41bc-a19b-a41195e16d8f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.927582 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-config-data" (OuterVolumeSpecName: "config-data") pod "872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" (UID: "872f90d4-ad5a-4d6b-a529-4fc5f5fea83c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.934501 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" (UID: "872f90d4-ad5a-4d6b-a529-4fc5f5fea83c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.959456 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "6b898bb8-339a-41bc-a19b-a41195e16d8f" (UID: "6b898bb8-339a-41bc-a19b-a41195e16d8f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.962828 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6b898bb8-339a-41bc-a19b-a41195e16d8f" (UID: "6b898bb8-339a-41bc-a19b-a41195e16d8f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.998511 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.998537 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6jkd\" (UniqueName: \"kubernetes.io/projected/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-kube-api-access-d6jkd\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.998547 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.998555 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.998585 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzpxp\" (UniqueName: \"kubernetes.io/projected/6b898bb8-339a-41bc-a19b-a41195e16d8f-kube-api-access-wzpxp\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.998597 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.998605 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:47 crc kubenswrapper[4831]: I1203 06:54:47.998614 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6b898bb8-339a-41bc-a19b-a41195e16d8f-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.810824 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"872f90d4-ad5a-4d6b-a529-4fc5f5fea83c","Type":"ContainerDied","Data":"f55fb49ffdb7b30b6e01745d6231c72f61923c1bc5985a263b021ac2578f63bd"} Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.810839 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.811160 4831 scope.go:117] "RemoveContainer" containerID="135ca5a6d135fc68371a549db1898457ff97cd0c1a268d01721f62961e990264" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.815124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586","Type":"ContainerStarted","Data":"f6d28e8b3135eebdfd374f9c303c4d8edb75a0da36bd550a21205abedf1942c2"} Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.815149 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.815166 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586","Type":"ContainerStarted","Data":"7d0ad17b9c4b331c9b8e4e43f1949619c278138c064021159c668b145db79938"} Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.859418 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.859399297 podStartE2EDuration="2.859399297s" podCreationTimestamp="2025-12-03 06:54:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:48.84529226 +0000 UTC m=+1426.188875778" watchObservedRunningTime="2025-12-03 06:54:48.859399297 +0000 UTC m=+1426.202982815" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.880075 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.913254 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.949823 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.960622 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:48 crc kubenswrapper[4831]: E1203 06:54:48.961275 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-api" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.961299 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-api" Dec 03 06:54:48 crc kubenswrapper[4831]: E1203 06:54:48.961382 4831 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-log" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.961393 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-log" Dec 03 06:54:48 crc kubenswrapper[4831]: E1203 06:54:48.961439 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" containerName="nova-scheduler-scheduler" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.961450 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" containerName="nova-scheduler-scheduler" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.961801 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-api" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.961857 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" containerName="nova-api-log" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.961877 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" containerName="nova-scheduler-scheduler" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.963612 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.965434 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.967772 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.968195 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.974813 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.986170 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.993746 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.995176 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:54:48 crc kubenswrapper[4831]: I1203 06:54:48.997021 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.006683 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.015916 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.015975 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.016004 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4x2j\" (UniqueName: \"kubernetes.io/projected/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-kube-api-access-j4x2j\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.016071 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-logs\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.016141 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-public-tls-certs\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.016161 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-config-data\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.024932 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b898bb8-339a-41bc-a19b-a41195e16d8f" path="/var/lib/kubelet/pods/6b898bb8-339a-41bc-a19b-a41195e16d8f/volumes" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.025607 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="872f90d4-ad5a-4d6b-a529-4fc5f5fea83c" path="/var/lib/kubelet/pods/872f90d4-ad5a-4d6b-a529-4fc5f5fea83c/volumes" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117651 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-public-tls-certs\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117718 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-config-data\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117749 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-h5dxt\" (UniqueName: \"kubernetes.io/projected/aadb65a0-295d-4fcf-b148-44480346d357-kube-api-access-h5dxt\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117863 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j4x2j\" (UniqueName: \"kubernetes.io/projected/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-kube-api-access-j4x2j\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117900 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-config-data\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117948 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-logs\") pod \"nova-api-0\" (UID: 
\"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.117975 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.119801 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-logs\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.123140 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.123255 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-config-data\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.123362 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-internal-tls-certs\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.137830 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-public-tls-certs\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.141463 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4x2j\" (UniqueName: \"kubernetes.io/projected/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-kube-api-access-j4x2j\") pod \"nova-api-0\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.219819 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-config-data\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.219982 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.220139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h5dxt\" (UniqueName: \"kubernetes.io/projected/aadb65a0-295d-4fcf-b148-44480346d357-kube-api-access-h5dxt\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.227509 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-config-data\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 
06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.227742 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.240278 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5dxt\" (UniqueName: \"kubernetes.io/projected/aadb65a0-295d-4fcf-b148-44480346d357-kube-api-access-h5dxt\") pod \"nova-scheduler-0\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.280126 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.308772 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.786753 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:54:49 crc kubenswrapper[4831]: W1203 06:54:49.793325 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a438bff_fbe4_4ae4_8d0f_3eecc1819f50.slice/crio-7c9fd6d53d6b13b72e1b080be38d06ab0ab8c4de2d04ebb61ebaf54998d99e46 WatchSource:0}: Error finding container 7c9fd6d53d6b13b72e1b080be38d06ab0ab8c4de2d04ebb61ebaf54998d99e46: Status 404 returned error can't find the container with id 7c9fd6d53d6b13b72e1b080be38d06ab0ab8c4de2d04ebb61ebaf54998d99e46 Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.838990 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50","Type":"ContainerStarted","Data":"7c9fd6d53d6b13b72e1b080be38d06ab0ab8c4de2d04ebb61ebaf54998d99e46"} Dec 03 06:54:49 crc kubenswrapper[4831]: I1203 06:54:49.917297 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:54:49 crc kubenswrapper[4831]: W1203 06:54:49.926768 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaadb65a0_295d_4fcf_b148_44480346d357.slice/crio-cd394ff880e63d3ae898a844be66ad605e336a324338fbbcd3be61ddca712c73 WatchSource:0}: Error finding container cd394ff880e63d3ae898a844be66ad605e336a324338fbbcd3be61ddca712c73: Status 404 returned error can't find the container with id cd394ff880e63d3ae898a844be66ad605e336a324338fbbcd3be61ddca712c73 Dec 03 06:54:50 crc kubenswrapper[4831]: I1203 06:54:50.855167 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50","Type":"ContainerStarted","Data":"a5a52868ab8bcbffa60ec710ffeaf3085cae1de015dd39d00cd7003328b22a28"} Dec 03 06:54:50 crc kubenswrapper[4831]: I1203 06:54:50.855679 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50","Type":"ContainerStarted","Data":"d9f25cfe98a3d7189a069459787b3914756469c02ce6e44bffed77d599dd4887"} Dec 03 06:54:50 crc kubenswrapper[4831]: I1203 06:54:50.858263 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aadb65a0-295d-4fcf-b148-44480346d357","Type":"ContainerStarted","Data":"42a7022437eee40dbce5d592f4a83cabd1f8ff8152cabd031da50508af22483d"} Dec 03 06:54:50 crc kubenswrapper[4831]: I1203 06:54:50.858312 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aadb65a0-295d-4fcf-b148-44480346d357","Type":"ContainerStarted","Data":"cd394ff880e63d3ae898a844be66ad605e336a324338fbbcd3be61ddca712c73"} Dec 03 06:54:50 crc kubenswrapper[4831]: I1203 06:54:50.879622 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.879584921 podStartE2EDuration="2.879584921s" podCreationTimestamp="2025-12-03 06:54:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:54:50.877681382 +0000 UTC m=+1428.221264970" watchObservedRunningTime="2025-12-03 06:54:50.879584921 +0000 UTC m=+1428.223168469" Dec 03 06:54:50 crc kubenswrapper[4831]: I1203 06:54:50.922427 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.922394058 podStartE2EDuration="2.922394058s" podCreationTimestamp="2025-12-03 06:54:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 06:54:50.906148645 +0000 UTC m=+1428.249732183" watchObservedRunningTime="2025-12-03 06:54:50.922394058 +0000 UTC m=+1428.265977606" Dec 03 06:54:52 crc kubenswrapper[4831]: I1203 06:54:52.169201 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 06:54:52 crc kubenswrapper[4831]: I1203 06:54:52.169974 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 06:54:54 crc kubenswrapper[4831]: I1203 06:54:54.309580 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 06:54:57 crc kubenswrapper[4831]: I1203 06:54:57.169730 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 06:54:57 crc kubenswrapper[4831]: I1203 06:54:57.170325 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 06:54:58 crc kubenswrapper[4831]: I1203 06:54:58.183481 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:54:58 crc kubenswrapper[4831]: I1203 06:54:58.183831 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:54:59 crc kubenswrapper[4831]: I1203 06:54:59.281421 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 06:54:59 crc kubenswrapper[4831]: I1203 06:54:59.281844 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 06:54:59 crc kubenswrapper[4831]: I1203 06:54:59.309574 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 06:54:59 crc kubenswrapper[4831]: I1203 06:54:59.353773 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 06:55:00 crc kubenswrapper[4831]: I1203 06:55:00.002105 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 06:55:00 crc kubenswrapper[4831]: I1203 06:55:00.290932 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:00 crc kubenswrapper[4831]: I1203 06:55:00.290979 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.205:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:01 crc kubenswrapper[4831]: I1203 06:55:01.261839 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 06:55:07 crc kubenswrapper[4831]: I1203 06:55:07.180963 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 06:55:07 crc kubenswrapper[4831]: I1203 06:55:07.181808 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 06:55:07 crc kubenswrapper[4831]: I1203 06:55:07.190587 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/nova-metadata-0" Dec 03 06:55:07 crc kubenswrapper[4831]: I1203 06:55:07.191253 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 06:55:09 crc kubenswrapper[4831]: I1203 06:55:09.290662 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 06:55:09 crc kubenswrapper[4831]: I1203 06:55:09.291825 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 06:55:09 crc kubenswrapper[4831]: I1203 06:55:09.296144 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 06:55:09 crc kubenswrapper[4831]: I1203 06:55:09.304954 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 06:55:10 crc kubenswrapper[4831]: I1203 06:55:10.099859 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 06:55:10 crc kubenswrapper[4831]: I1203 06:55:10.110670 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.570417 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-768d8958fd-hbthr"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.573339 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.592002 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-5d79b99f8c-4tl6f"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.593575 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.610135 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d79b99f8c-4tl6f"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.626704 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-768d8958fd-hbthr"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.673168 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.673406 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="4cc3003a-0145-4c6f-bf53-10c7c574874f" containerName="openstackclient" containerID="cri-o://3aec4649bc0f7597a824ffeb353a8d7f39afe26459a358163be0cbfca4a1b51b" gracePeriod=2 Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.682627 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742018 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d95vz\" (UniqueName: \"kubernetes.io/projected/46af7209-8790-44ab-b255-8c84c3f5255a-kube-api-access-d95vz\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742114 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data-custom\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 
06:55:27.742153 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742175 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-combined-ca-bundle\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742201 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data-custom\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742231 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46af7209-8790-44ab-b255-8c84c3f5255a-logs\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742251 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-combined-ca-bundle\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: 
\"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742295 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742356 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b96e376-8104-4a92-b1ec-6078943d0b50-logs\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742385 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6mx4\" (UniqueName: \"kubernetes.io/projected/4b96e376-8104-4a92-b1ec-6078943d0b50-kube-api-access-x6mx4\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742485 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-869cfdc5c4-7898s"] Dec 03 06:55:27 crc kubenswrapper[4831]: E1203 06:55:27.742890 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4cc3003a-0145-4c6f-bf53-10c7c574874f" containerName="openstackclient" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.742903 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4cc3003a-0145-4c6f-bf53-10c7c574874f" containerName="openstackclient" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.743092 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="4cc3003a-0145-4c6f-bf53-10c7c574874f" containerName="openstackclient" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.744034 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.763396 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.781834 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-869cfdc5c4-7898s"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.847476 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-combined-ca-bundle\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.847753 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data-custom\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.847842 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.847939 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-public-tls-certs\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848033 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-combined-ca-bundle\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b16411-0a37-4432-965b-746c2d70d00b-logs\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848277 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data-custom\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 
06:55:27.848371 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-internal-tls-certs\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848469 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46af7209-8790-44ab-b255-8c84c3f5255a-logs\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848550 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data-custom\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848624 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-combined-ca-bundle\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848742 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 
06:55:27.848832 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b96e376-8104-4a92-b1ec-6078943d0b50-logs\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848903 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqpdf\" (UniqueName: \"kubernetes.io/projected/e0b16411-0a37-4432-965b-746c2d70d00b-kube-api-access-kqpdf\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.848984 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6mx4\" (UniqueName: \"kubernetes.io/projected/4b96e376-8104-4a92-b1ec-6078943d0b50-kube-api-access-x6mx4\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.849077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d95vz\" (UniqueName: \"kubernetes.io/projected/46af7209-8790-44ab-b255-8c84c3f5255a-kube-api-access-d95vz\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.872617 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46af7209-8790-44ab-b255-8c84c3f5255a-logs\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 
06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.873073 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b96e376-8104-4a92-b1ec-6078943d0b50-logs\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.873623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data-custom\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.882388 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-combined-ca-bundle\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.882871 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.893070 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-combined-ca-bundle\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 
06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.893950 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.894060 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d95vz\" (UniqueName: \"kubernetes.io/projected/46af7209-8790-44ab-b255-8c84c3f5255a-kube-api-access-d95vz\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.894427 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data-custom\") pod \"barbican-worker-5d79b99f8c-4tl6f\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.915333 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6mx4\" (UniqueName: \"kubernetes.io/projected/4b96e376-8104-4a92-b1ec-6078943d0b50-kube-api-access-x6mx4\") pod \"barbican-keystone-listener-768d8958fd-hbthr\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.918762 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.949939 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.957778 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerName="openstack-network-exporter" containerID="cri-o://0546a20824087822da4e65e207c3344405637cecf083c66a5cb892cd81e57ccb" gracePeriod=300 Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.966241 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-combined-ca-bundle\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.966303 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.966366 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-public-tls-certs\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.966432 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b16411-0a37-4432-965b-746c2d70d00b-logs\") pod 
\"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.966478 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-internal-tls-certs\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.966525 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data-custom\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.966609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqpdf\" (UniqueName: \"kubernetes.io/projected/e0b16411-0a37-4432-965b-746c2d70d00b-kube-api-access-kqpdf\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: E1203 06:55:27.967680 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 06:55:27 crc kubenswrapper[4831]: E1203 06:55:27.967751 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data podName:dc0cbb94-92ec-4369-b609-f3186f302c66 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:28.467726188 +0000 UTC m=+1465.811309776 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data") pod "rabbitmq-server-0" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66") : configmap "rabbitmq-config-data" not found Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.974004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b16411-0a37-4432-965b-746c2d70d00b-logs\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.981635 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-combined-ca-bundle\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.994267 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data-custom\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:27 crc kubenswrapper[4831]: I1203 06:55:27.997681 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-internal-tls-certs\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.005863 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-public-tls-certs\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.007474 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.027776 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqpdf\" (UniqueName: \"kubernetes.io/projected/e0b16411-0a37-4432-965b-746c2d70d00b-kube-api-access-kqpdf\") pod \"barbican-api-869cfdc5c4-7898s\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.088795 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.199965 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.368957 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutronb468-account-delete-9hl65"] Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.372956 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.375562 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-nb-0" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerName="ovsdbserver-nb" containerID="cri-o://bde47117f94d4b3d7621163e09c3a26678467d5110db8728318ccec32cffa515" gracePeriod=300 Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.384021 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.463634 4831 generic.go:334] "Generic (PLEG): container finished" podID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerID="0546a20824087822da4e65e207c3344405637cecf083c66a5cb892cd81e57ccb" exitCode=2 Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.463686 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d0fdc967-7fb5-4702-b184-6953e8aefd19","Type":"ContainerDied","Data":"0546a20824087822da4e65e207c3344405637cecf083c66a5cb892cd81e57ccb"} Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.513031 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwdv6\" (UniqueName: \"kubernetes.io/projected/6f54df74-81ac-43f7-9075-51cb26200c4e-kube-api-access-qwdv6\") pod \"neutronb468-account-delete-9hl65\" (UID: \"6f54df74-81ac-43f7-9075-51cb26200c4e\") " pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.513649 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts\") pod \"neutronb468-account-delete-9hl65\" (UID: \"6f54df74-81ac-43f7-9075-51cb26200c4e\") " pod="openstack/neutronb468-account-delete-9hl65" Dec 03 
06:55:28 crc kubenswrapper[4831]: E1203 06:55:28.513893 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 06:55:28 crc kubenswrapper[4831]: E1203 06:55:28.513941 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data podName:dc0cbb94-92ec-4369-b609-f3186f302c66 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:29.513927415 +0000 UTC m=+1466.857510923 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data") pod "rabbitmq-server-0" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66") : configmap "rabbitmq-config-data" not found Dec 03 06:55:28 crc kubenswrapper[4831]: E1203 06:55:28.581307 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:28 crc kubenswrapper[4831]: E1203 06:55:28.581511 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data podName:8d6ac806-4ac5-4de4-b6a0-b265032150f4 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:29.081492551 +0000 UTC m=+1466.425076059 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data") pod "rabbitmq-cell1-server-0" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4") : configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.582845 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronb468-account-delete-9hl65"] Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.623519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts\") pod \"neutronb468-account-delete-9hl65\" (UID: \"6f54df74-81ac-43f7-9075-51cb26200c4e\") " pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.623945 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwdv6\" (UniqueName: \"kubernetes.io/projected/6f54df74-81ac-43f7-9075-51cb26200c4e-kube-api-access-qwdv6\") pod \"neutronb468-account-delete-9hl65\" (UID: \"6f54df74-81ac-43f7-9075-51cb26200c4e\") " pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.625051 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts\") pod \"neutronb468-account-delete-9hl65\" (UID: \"6f54df74-81ac-43f7-9075-51cb26200c4e\") " pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.673515 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.763403 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" 
podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="ovn-northd" containerID="cri-o://f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" gracePeriod=30 Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.764070 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-northd-0" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="openstack-network-exporter" containerID="cri-o://86e3f1fa5d839f3ee714b40f2722df67d64bd2305d8193e7a010af7f96797f76" gracePeriod=30 Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.780628 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwdv6\" (UniqueName: \"kubernetes.io/projected/6f54df74-81ac-43f7-9075-51cb26200c4e-kube-api-access-qwdv6\") pod \"neutronb468-account-delete-9hl65\" (UID: \"6f54df74-81ac-43f7-9075-51cb26200c4e\") " pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.802652 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement2619-account-delete-d6v4b"] Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.803974 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:28 crc kubenswrapper[4831]: W1203 06:55:28.804542 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46af7209_8790_44ab_b255_8c84c3f5255a.slice/crio-31812e5f50a2f4d7b7c81b6ee905feac628f4bd890a170bb5b7ecbb2605a2e1d WatchSource:0}: Error finding container 31812e5f50a2f4d7b7c81b6ee905feac628f4bd890a170bb5b7ecbb2605a2e1d: Status 404 returned error can't find the container with id 31812e5f50a2f4d7b7c81b6ee905feac628f4bd890a170bb5b7ecbb2605a2e1d Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.807638 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.865151 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement2619-account-delete-d6v4b"] Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.897883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cth2\" (UniqueName: \"kubernetes.io/projected/20722e3f-f810-4ac7-80d7-09cae400150a-kube-api-access-9cth2\") pod \"placement2619-account-delete-d6v4b\" (UID: \"20722e3f-f810-4ac7-80d7-09cae400150a\") " pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.898080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20722e3f-f810-4ac7-80d7-09cae400150a-operator-scripts\") pod \"placement2619-account-delete-d6v4b\" (UID: \"20722e3f-f810-4ac7-80d7-09cae400150a\") " pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.970407 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder3a6d-account-delete-b6ptg"] Dec 03 06:55:28 crc kubenswrapper[4831]: I1203 06:55:28.973226 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.006288 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cth2\" (UniqueName: \"kubernetes.io/projected/20722e3f-f810-4ac7-80d7-09cae400150a-kube-api-access-9cth2\") pod \"placement2619-account-delete-d6v4b\" (UID: \"20722e3f-f810-4ac7-80d7-09cae400150a\") " pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.006359 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20722e3f-f810-4ac7-80d7-09cae400150a-operator-scripts\") pod \"placement2619-account-delete-d6v4b\" (UID: \"20722e3f-f810-4ac7-80d7-09cae400150a\") " pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.007221 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20722e3f-f810-4ac7-80d7-09cae400150a-operator-scripts\") pod \"placement2619-account-delete-d6v4b\" (UID: \"20722e3f-f810-4ac7-80d7-09cae400150a\") " pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.011464 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder3a6d-account-delete-b6ptg"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.111579 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8hrh\" (UniqueName: \"kubernetes.io/projected/e69614e3-a574-42e1-adc2-09861e9974e5-kube-api-access-n8hrh\") pod \"cinder3a6d-account-delete-b6ptg\" (UID: \"e69614e3-a574-42e1-adc2-09861e9974e5\") " pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.111675 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69614e3-a574-42e1-adc2-09861e9974e5-operator-scripts\") pod \"cinder3a6d-account-delete-b6ptg\" (UID: \"e69614e3-a574-42e1-adc2-09861e9974e5\") " pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:29 crc kubenswrapper[4831]: E1203 06:55:29.111971 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:29 crc kubenswrapper[4831]: E1203 06:55:29.112010 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data podName:8d6ac806-4ac5-4de4-b6a0-b265032150f4 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:30.111995411 +0000 UTC m=+1467.455578919 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data") pod "rabbitmq-cell1-server-0" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4") : configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.116119 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cth2\" (UniqueName: \"kubernetes.io/projected/20722e3f-f810-4ac7-80d7-09cae400150a-kube-api-access-9cth2\") pod \"placement2619-account-delete-d6v4b\" (UID: \"20722e3f-f810-4ac7-80d7-09cae400150a\") " pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.214495 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8hrh\" (UniqueName: \"kubernetes.io/projected/e69614e3-a574-42e1-adc2-09861e9974e5-kube-api-access-n8hrh\") pod \"cinder3a6d-account-delete-b6ptg\" (UID: \"e69614e3-a574-42e1-adc2-09861e9974e5\") " pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 
06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.214584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69614e3-a574-42e1-adc2-09861e9974e5-operator-scripts\") pod \"cinder3a6d-account-delete-b6ptg\" (UID: \"e69614e3-a574-42e1-adc2-09861e9974e5\") " pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.215406 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69614e3-a574-42e1-adc2-09861e9974e5-operator-scripts\") pod \"cinder3a6d-account-delete-b6ptg\" (UID: \"e69614e3-a574-42e1-adc2-09861e9974e5\") " pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.268934 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-hh6jm"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.269222 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-68mzw"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.269235 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-68mzw"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.269248 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-hh6jm"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.269257 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-dkxbk"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.269268 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-5d79b99f8c-4tl6f"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.269280 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance3e24-account-delete-nkv48"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.270345 4831 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/cinder-db-sync-dkxbk"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.270370 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance3e24-account-delete-nkv48"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.270383 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-w7h89"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.270399 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8s4vh"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.270415 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-95nwv"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.270686 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.272057 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-metrics-8s4vh" podUID="3b5c67f9-4d9b-428a-a974-9162d81b1f02" containerName="openstack-network-exporter" containerID="cri-o://85779452492d794c13e8c469cf65390b164504234a47b5ad1564537fb4273f5a" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.282949 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-nxpq5"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.283762 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8hrh\" (UniqueName: \"kubernetes.io/projected/e69614e3-a574-42e1-adc2-09861e9974e5-kube-api-access-n8hrh\") pod \"cinder3a6d-account-delete-b6ptg\" (UID: \"e69614e3-a574-42e1-adc2-09861e9974e5\") " pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.302369 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-nxpq5"] Dec 03 06:55:29 
crc kubenswrapper[4831]: I1203 06:55:29.306841 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="ovn-northd" probeResult="failure" output=< Dec 03 06:55:29 crc kubenswrapper[4831]: 2025-12-03T06:55:29Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Dec 03 06:55:29 crc kubenswrapper[4831]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Dec 03 06:55:29 crc kubenswrapper[4831]: 2025-12-03T06:55:29Z|00001|unixctl|WARN|failed to connect to /tmp/ovn-northd.1.ctl Dec 03 06:55:29 crc kubenswrapper[4831]: ovn-appctl: cannot connect to "/tmp/ovn-northd.1.ctl" (No such file or directory) Dec 03 06:55:29 crc kubenswrapper[4831]: > Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.318861 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54eb5ffe-7e2f-4a33-9689-0470affe10e0-operator-scripts\") pod \"glance3e24-account-delete-nkv48\" (UID: \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\") " pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.318992 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flv5c\" (UniqueName: \"kubernetes.io/projected/54eb5ffe-7e2f-4a33-9689-0470affe10e0-kube-api-access-flv5c\") pod \"glance3e24-account-delete-nkv48\" (UID: \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\") " pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.325924 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican3e51-account-delete-4jnl5"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.358607 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-lw9v5"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 
06:55:29.361292 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.362015 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" podUID="4f8ca881-226a-4311-aada-636335beea0d" containerName="dnsmasq-dns" containerID="cri-o://77f57d07ea85aecac94834d02ec77c6992bc37074e8198760a59e04a568577ff" gracePeriod=10 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.373768 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican3e51-account-delete-4jnl5"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.387525 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.387839 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="openstack-network-exporter" containerID="cri-o://f49e9e078c0c7dc85b489ebd0a280551621b057e73799f1a37d159eb18f5dff8" gracePeriod=300 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.410937 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-t4m8r"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.420957 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54eb5ffe-7e2f-4a33-9689-0470affe10e0-operator-scripts\") pod \"glance3e24-account-delete-nkv48\" (UID: \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\") " pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.421011 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpjfw\" (UniqueName: 
\"kubernetes.io/projected/81084d6a-9987-4466-8f89-455aa3ff2627-kube-api-access-vpjfw\") pod \"barbican3e51-account-delete-4jnl5\" (UID: \"81084d6a-9987-4466-8f89-455aa3ff2627\") " pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.421053 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81084d6a-9987-4466-8f89-455aa3ff2627-operator-scripts\") pod \"barbican3e51-account-delete-4jnl5\" (UID: \"81084d6a-9987-4466-8f89-455aa3ff2627\") " pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.421084 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flv5c\" (UniqueName: \"kubernetes.io/projected/54eb5ffe-7e2f-4a33-9689-0470affe10e0-kube-api-access-flv5c\") pod \"glance3e24-account-delete-nkv48\" (UID: \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\") " pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.421909 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54eb5ffe-7e2f-4a33-9689-0470affe10e0-operator-scripts\") pod \"glance3e24-account-delete-nkv48\" (UID: \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\") " pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.423193 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-844cdc6797-kqpvp"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.423413 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-844cdc6797-kqpvp" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-api" containerID="cri-o://dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d" gracePeriod=30 Dec 03 06:55:29 crc 
kubenswrapper[4831]: I1203 06:55:29.423789 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-844cdc6797-kqpvp" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-httpd" containerID="cri-o://504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.440987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flv5c\" (UniqueName: \"kubernetes.io/projected/54eb5ffe-7e2f-4a33-9689-0470affe10e0-kube-api-access-flv5c\") pod \"glance3e24-account-delete-nkv48\" (UID: \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\") " pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.453510 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.454116 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-server" containerID="cri-o://f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.454639 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="swift-recon-cron" containerID="cri-o://fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.454703 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="rsync" containerID="cri-o://4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.454766 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-expirer" containerID="cri-o://c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461482 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-updater" containerID="cri-o://028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461574 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-auditor" containerID="cri-o://a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461656 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-replicator" containerID="cri-o://46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461710 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-server" containerID="cri-o://b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461750 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-updater" 
containerID="cri-o://d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461794 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-auditor" containerID="cri-o://2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461845 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-replicator" containerID="cri-o://6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461885 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-server" containerID="cri-o://a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461921 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-reaper" containerID="cri-o://56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.461994 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-auditor" containerID="cri-o://8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.462036 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/swift-storage-0" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-replicator" containerID="cri-o://3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.477706 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovsdbserver-sb-0" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="ovsdbserver-sb" containerID="cri-o://f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210" gracePeriod=300 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.477836 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-t4m8r"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.504612 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5d9bffbcdd-ztjkw"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.504903 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5d9bffbcdd-ztjkw" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerName="placement-log" containerID="cri-o://0c9038e6a18689fceba5e0a50ac6d4a0a041c704037c77f6242ca7ae37b78999" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.506616 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-5d9bffbcdd-ztjkw" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerName="placement-api" containerID="cri-o://4dfb2678a75458c31ce4e501a156049718eef44858bc8bb12bca6a8c8f4adbfa" gracePeriod=30 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.518469 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-f2nlc"] Dec 03 06:55:29 crc kubenswrapper[4831]: E1203 06:55:29.521990 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0fdc967_7fb5_4702_b184_6953e8aefd19.slice/crio-bde47117f94d4b3d7621163e09c3a26678467d5110db8728318ccec32cffa515.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0fdc967_7fb5_4702_b184_6953e8aefd19.slice/crio-conmon-bde47117f94d4b3d7621163e09c3a26678467d5110db8728318ccec32cffa515.scope\": RecentStats: unable to find data in memory cache]" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.522431 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpjfw\" (UniqueName: \"kubernetes.io/projected/81084d6a-9987-4466-8f89-455aa3ff2627-kube-api-access-vpjfw\") pod \"barbican3e51-account-delete-4jnl5\" (UID: \"81084d6a-9987-4466-8f89-455aa3ff2627\") " pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.522498 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81084d6a-9987-4466-8f89-455aa3ff2627-operator-scripts\") pod \"barbican3e51-account-delete-4jnl5\" (UID: \"81084d6a-9987-4466-8f89-455aa3ff2627\") " pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:29 crc kubenswrapper[4831]: E1203 06:55:29.522789 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 06:55:29 crc kubenswrapper[4831]: E1203 06:55:29.522836 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data podName:dc0cbb94-92ec-4369-b609-f3186f302c66 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:31.52281932 +0000 UTC m=+1468.866402828 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data") pod "rabbitmq-server-0" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66") : configmap "rabbitmq-config-data" not found Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.524292 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81084d6a-9987-4466-8f89-455aa3ff2627-operator-scripts\") pod \"barbican3e51-account-delete-4jnl5\" (UID: \"81084d6a-9987-4466-8f89-455aa3ff2627\") " pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.638819 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpjfw\" (UniqueName: \"kubernetes.io/projected/81084d6a-9987-4466-8f89-455aa3ff2627-kube-api-access-vpjfw\") pod \"barbican3e51-account-delete-4jnl5\" (UID: \"81084d6a-9987-4466-8f89-455aa3ff2627\") " pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.661207 4831 generic.go:334] "Generic (PLEG): container finished" podID="5b548290-abc5-4c67-862c-16aa03a652da" containerID="86e3f1fa5d839f3ee714b40f2722df67d64bd2305d8193e7a010af7f96797f76" exitCode=2 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.661293 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b548290-abc5-4c67-862c-16aa03a652da","Type":"ContainerDied","Data":"86e3f1fa5d839f3ee714b40f2722df67d64bd2305d8193e7a010af7f96797f76"} Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.678551 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-f2nlc"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.680297 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-8s4vh_3b5c67f9-4d9b-428a-a974-9162d81b1f02/openstack-network-exporter/0.log" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.681300 4831 generic.go:334] "Generic (PLEG): container finished" podID="3b5c67f9-4d9b-428a-a974-9162d81b1f02" containerID="85779452492d794c13e8c469cf65390b164504234a47b5ad1564537fb4273f5a" exitCode=2 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.681763 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8s4vh" event={"ID":"3b5c67f9-4d9b-428a-a974-9162d81b1f02","Type":"ContainerDied","Data":"85779452492d794c13e8c469cf65390b164504234a47b5ad1564537fb4273f5a"} Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.743506 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-jqkhn"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.764642 4831 generic.go:334] "Generic (PLEG): container finished" podID="4f8ca881-226a-4311-aada-636335beea0d" containerID="77f57d07ea85aecac94834d02ec77c6992bc37074e8198760a59e04a568577ff" exitCode=0 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.764738 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" event={"ID":"4f8ca881-226a-4311-aada-636335beea0d","Type":"ContainerDied","Data":"77f57d07ea85aecac94834d02ec77c6992bc37074e8198760a59e04a568577ff"} Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.773230 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d0fdc967-7fb5-4702-b184-6953e8aefd19/ovsdbserver-nb/0.log" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.773275 4831 generic.go:334] "Generic (PLEG): container finished" podID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerID="bde47117f94d4b3d7621163e09c3a26678467d5110db8728318ccec32cffa515" exitCode=143 Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.773358 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d0fdc967-7fb5-4702-b184-6953e8aefd19","Type":"ContainerDied","Data":"bde47117f94d4b3d7621163e09c3a26678467d5110db8728318ccec32cffa515"} Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.776300 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" event={"ID":"46af7209-8790-44ab-b255-8c84c3f5255a","Type":"ContainerStarted","Data":"31812e5f50a2f4d7b7c81b6ee905feac628f4bd890a170bb5b7ecbb2605a2e1d"} Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.799194 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novacell07556-account-delete-rtljc"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.811207 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.840030 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-jqkhn"] Dec 03 06:55:29 crc kubenswrapper[4831]: I1203 06:55:29.855251 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-vw5wm"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:29.885885 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-operator-scripts\") pod \"novacell07556-account-delete-rtljc\" (UID: \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\") " pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:29.886709 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn22b\" (UniqueName: \"kubernetes.io/projected/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-kube-api-access-mn22b\") pod \"novacell07556-account-delete-rtljc\" (UID: 
\"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\") " pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:29.924893 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell07556-account-delete-rtljc"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:29.942644 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-vw5wm"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:29.997447 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mn22b\" (UniqueName: \"kubernetes.io/projected/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-kube-api-access-mn22b\") pod \"novacell07556-account-delete-rtljc\" (UID: \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\") " pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:29.997539 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-operator-scripts\") pod \"novacell07556-account-delete-rtljc\" (UID: \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\") " pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.013484 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.014147 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-operator-scripts\") pod \"novacell07556-account-delete-rtljc\" (UID: \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\") " pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.019901 4831 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="openstack/cinder-api-0" secret="" err="secret \"cinder-cinder-dockercfg-qh756\" not found" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.092535 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.093056 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fa796212-03db-4860-93f4-d2918ed44070" containerName="cinder-scheduler" containerID="cri-o://e1b665f5de4ff9fea0d7f46233aebcb76bd558b21574f6845a1c4a0745851315" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.093408 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn22b\" (UniqueName: \"kubernetes.io/projected/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-kube-api-access-mn22b\") pod \"novacell07556-account-delete-rtljc\" (UID: \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\") " pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.093490 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="fa796212-03db-4860-93f4-d2918ed44070" containerName="probe" containerID="cri-o://fd1e277c8951121dadc44ba354b735192014953fab00452b13030855e8591cd1" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.102754 4831 secret.go:188] Couldn't get secret openstack/cinder-api-config-data: secret "cinder-api-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.102806 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:30.602790815 +0000 UTC m=+1467.946374323 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-api-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.102842 4831 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.102861 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:30.602854737 +0000 UTC m=+1467.946438245 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.103840 4831 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.103866 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:30.603858227 +0000 UTC m=+1467.947441735 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-scripts" not found Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.138610 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/novaapi8edc-account-delete-2pvdg"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.139934 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.178151 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.180521 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi8edc-account-delete-2pvdg"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.204784 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2rnn\" (UniqueName: \"kubernetes.io/projected/56055ee5-407e-4ced-865f-03585e5f7f7b-kube-api-access-h2rnn\") pod \"novaapi8edc-account-delete-2pvdg\" (UID: \"56055ee5-407e-4ced-865f-03585e5f7f7b\") " pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.204824 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56055ee5-407e-4ced-865f-03585e5f7f7b-operator-scripts\") pod \"novaapi8edc-account-delete-2pvdg\" (UID: \"56055ee5-407e-4ced-865f-03585e5f7f7b\") " pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.205082 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 
06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.205126 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data podName:8d6ac806-4ac5-4de4-b6a0-b265032150f4 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:32.205110837 +0000 UTC m=+1469.548694345 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data") pod "rabbitmq-cell1-server-0" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4") : configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.205491 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.208639 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.275101 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" containerID="cri-o://e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" gracePeriod=29 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.276125 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.284508 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.299769 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d0fdc967-7fb5-4702-b184-6953e8aefd19/ovsdbserver-nb/0.log" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.299830 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.301140 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210 is running failed: container process not found" containerID="f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.301661 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210 is running failed: container process not found" containerID="f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.302004 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210 is running failed: container process not found" containerID="f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210" cmd=["/usr/bin/pidof","ovsdb-server"] Dec 03 06:55:31 crc kubenswrapper[4831]: 
E1203 06:55:30.302064 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210 is running failed: container process not found" probeType="Readiness" pod="openstack/ovsdbserver-sb-0" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="ovsdbserver-sb" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.306729 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2rnn\" (UniqueName: \"kubernetes.io/projected/56055ee5-407e-4ced-865f-03585e5f7f7b-kube-api-access-h2rnn\") pod \"novaapi8edc-account-delete-2pvdg\" (UID: \"56055ee5-407e-4ced-865f-03585e5f7f7b\") " pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.306769 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56055ee5-407e-4ced-865f-03585e5f7f7b-operator-scripts\") pod \"novaapi8edc-account-delete-2pvdg\" (UID: \"56055ee5-407e-4ced-865f-03585e5f7f7b\") " pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.307572 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56055ee5-407e-4ced-865f-03585e5f7f7b-operator-scripts\") pod \"novaapi8edc-account-delete-2pvdg\" (UID: \"56055ee5-407e-4ced-865f-03585e5f7f7b\") " pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.309847 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.322666 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2rnn\" (UniqueName: 
\"kubernetes.io/projected/56055ee5-407e-4ced-865f-03585e5f7f7b-kube-api-access-h2rnn\") pod \"novaapi8edc-account-delete-2pvdg\" (UID: \"56055ee5-407e-4ced-865f-03585e5f7f7b\") " pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.332904 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.333154 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerName="glance-log" containerID="cri-o://a6fcbb703b3c57a0767a6e84e0a057dc745d0f18dbb25eedf9c9e63269b59524" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.333623 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerName="glance-httpd" containerID="cri-o://134ca55163ecb5e5269d7ec7014e4b029385ad75e92d2133a4bef2bd84b55d9f" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.354602 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.354919 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerName="glance-log" containerID="cri-o://67de1998f02ce4f499b3387ae3a93d5ab8ad64720e8c404dce1cf6612058988f" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.355381 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerName="glance-httpd" containerID="cri-o://74e0644a9792269f43c86550ff06e6fa5782d3aaca7a69696c14e40e26a5beec" gracePeriod=30 
Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.386877 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.406471 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.408002 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-nb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"d0fdc967-7fb5-4702-b184-6953e8aefd19\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.408031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdb-rundir\") pod \"d0fdc967-7fb5-4702-b184-6953e8aefd19\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.408069 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdbserver-nb-tls-certs\") pod \"d0fdc967-7fb5-4702-b184-6953e8aefd19\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.408145 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2htg6\" (UniqueName: \"kubernetes.io/projected/d0fdc967-7fb5-4702-b184-6953e8aefd19-kube-api-access-2htg6\") pod \"d0fdc967-7fb5-4702-b184-6953e8aefd19\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.408183 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-metrics-certs-tls-certs\") pod \"d0fdc967-7fb5-4702-b184-6953e8aefd19\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.408210 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-combined-ca-bundle\") pod \"d0fdc967-7fb5-4702-b184-6953e8aefd19\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.408275 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-config\") pod \"d0fdc967-7fb5-4702-b184-6953e8aefd19\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.408376 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-scripts\") pod \"d0fdc967-7fb5-4702-b184-6953e8aefd19\" (UID: \"d0fdc967-7fb5-4702-b184-6953e8aefd19\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.414072 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "d0fdc967-7fb5-4702-b184-6953e8aefd19" (UID: "d0fdc967-7fb5-4702-b184-6953e8aefd19"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.417780 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-scripts" (OuterVolumeSpecName: "scripts") pod "d0fdc967-7fb5-4702-b184-6953e8aefd19" (UID: "d0fdc967-7fb5-4702-b184-6953e8aefd19"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.418937 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-config" (OuterVolumeSpecName: "config") pod "d0fdc967-7fb5-4702-b184-6953e8aefd19" (UID: "d0fdc967-7fb5-4702-b184-6953e8aefd19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.420944 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "ovndbcluster-nb-etc-ovn") pod "d0fdc967-7fb5-4702-b184-6953e8aefd19" (UID: "d0fdc967-7fb5-4702-b184-6953e8aefd19"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.426778 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0fdc967-7fb5-4702-b184-6953e8aefd19-kube-api-access-2htg6" (OuterVolumeSpecName: "kube-api-access-2htg6") pod "d0fdc967-7fb5-4702-b184-6953e8aefd19" (UID: "d0fdc967-7fb5-4702-b184-6953e8aefd19"). InnerVolumeSpecName "kube-api-access-2htg6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.458785 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerName="rabbitmq" containerID="cri-o://309cf75d8cbb562efc0116b04c83dd3b65a4b14e1de35f409f188bc0a497e2b6" gracePeriod=604800 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.460285 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-869cfdc5c4-7898s"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.475014 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerName="rabbitmq" containerID="cri-o://e701230003bdb542945df73fbff521649b1c0dbe31d133988a35ce0bd844cb8e" gracePeriod=604800 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.511278 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.511754 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-log" containerID="cri-o://7d0ad17b9c4b331c9b8e4e43f1949619c278138c064021159c668b145db79938" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.512177 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-metadata" containerID="cri-o://f6d28e8b3135eebdfd374f9c303c4d8edb75a0da36bd550a21205abedf1942c2" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.527005 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2htg6\" (UniqueName: 
\"kubernetes.io/projected/d0fdc967-7fb5-4702-b184-6953e8aefd19-kube-api-access-2htg6\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.527029 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.527038 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/d0fdc967-7fb5-4702-b184-6953e8aefd19-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.527059 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.527068 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.540986 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5d79b99f8c-4tl6f"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.548947 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-55749f9879-hprsg"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.549283 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-55749f9879-hprsg" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerName="barbican-worker-log" containerID="cri-o://9e502551a8c629987632ae4db8d5bb2dcc87f5d6796281aca903d0518128dc14" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.549452 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/barbican-worker-55749f9879-hprsg" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerName="barbican-worker" containerID="cri-o://499ef1cada2eb4986f003c7c29c7c54c82c99a884b5bcf1f1b4f4fbda65a3300" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.566069 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-574cdc6988-72ggg"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.566375 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-574cdc6988-72ggg" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api-log" containerID="cri-o://bfdaec1e442053ec75746ee712e4fee25a008c9a2e11ba56c206f7698abbcfcd" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.566893 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-574cdc6988-72ggg" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api" containerID="cri-o://9c37d8393b980bdf5ae1d421b8b8297aec83f883d91765e900851761b2758842" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.593444 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-869cfdc5c4-7898s"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.593962 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.620480 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.620749 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" podUID="fd770919-d80e-4947-b52d-673beb117374" containerName="barbican-keystone-listener-log" containerID="cri-o://aca1e2538c4e0fa0e9737c5d8550f625f94f8366334b8f2873812e5895924cfc" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.620903 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" podUID="fd770919-d80e-4947-b52d-673beb117374" containerName="barbican-keystone-listener" containerID="cri-o://4de12fb31b83a10dc830b04fa625356818e600bca291e7e4c8c179b0fba4550c" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.626403 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.628985 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.629080 4831 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.629124 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. 
No retries permitted until 2025-12-03 06:55:31.629110176 +0000 UTC m=+1468.972693684 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-scripts" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.629202 4831 secret.go:188] Couldn't get secret openstack/cinder-api-config-data: secret "cinder-api-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.629224 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:31.629217769 +0000 UTC m=+1468.972801277 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-api-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.629256 4831 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.629274 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:31.62926717 +0000 UTC m=+1468.972850678 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.639033 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-768d8958fd-hbthr"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.647587 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdbserver-nb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-nb-tls-certs") pod "d0fdc967-7fb5-4702-b184-6953e8aefd19" (UID: "d0fdc967-7fb5-4702-b184-6953e8aefd19"). InnerVolumeSpecName "ovsdbserver-nb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.648697 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.648986 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-log" containerID="cri-o://d9f25cfe98a3d7189a069459787b3914756469c02ce6e44bffed77d599dd4887" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.649154 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-api" containerID="cri-o://a5a52868ab8bcbffa60ec710ffeaf3085cae1de015dd39d00cd7003328b22a28" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.665400 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-d6pll"] Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.666366 4831 handlers.go:78] "Exec lifecycle hook for 
Container in Pod failed" err=< Dec 03 06:55:31 crc kubenswrapper[4831]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 03 06:55:31 crc kubenswrapper[4831]: + source /usr/local/bin/container-scripts/functions Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNBridge=br-int Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNRemote=tcp:localhost:6642 Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNEncapType=geneve Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNAvailabilityZones= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ EnableChassisAsGateway=true Dec 03 06:55:31 crc kubenswrapper[4831]: ++ PhysicalNetworks= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNHostName= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 03 06:55:31 crc kubenswrapper[4831]: ++ ovs_dir=/var/lib/openvswitch Dec 03 06:55:31 crc kubenswrapper[4831]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 03 06:55:31 crc kubenswrapper[4831]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 03 06:55:31 crc kubenswrapper[4831]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + sleep 0.5 Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + sleep 0.5 Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + cleanup_ovsdb_server_semaphore Dec 03 06:55:31 crc kubenswrapper[4831]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 06:55:31 crc kubenswrapper[4831]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 03 06:55:31 crc kubenswrapper[4831]: > execCommand=["/usr/local/bin/container-scripts/stop-ovsdb-server.sh"] containerName="ovsdb-server" pod="openstack/ovn-controller-ovs-w7h89" message=< Dec 03 06:55:31 crc kubenswrapper[4831]: Exiting ovsdb-server (5) [ OK ] Dec 03 06:55:31 crc kubenswrapper[4831]: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 03 06:55:31 crc kubenswrapper[4831]: + source /usr/local/bin/container-scripts/functions Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNBridge=br-int Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNRemote=tcp:localhost:6642 Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNEncapType=geneve Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNAvailabilityZones= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ EnableChassisAsGateway=true Dec 03 06:55:31 crc kubenswrapper[4831]: ++ PhysicalNetworks= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNHostName= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 03 06:55:31 crc kubenswrapper[4831]: ++ ovs_dir=/var/lib/openvswitch Dec 03 06:55:31 crc kubenswrapper[4831]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 03 06:55:31 crc kubenswrapper[4831]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 03 06:55:31 crc kubenswrapper[4831]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + sleep 0.5 Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + sleep 0.5 Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + cleanup_ovsdb_server_semaphore Dec 03 06:55:31 crc kubenswrapper[4831]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 06:55:31 crc kubenswrapper[4831]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 03 06:55:31 crc kubenswrapper[4831]: > Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:30.666438 4831 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 03 06:55:31 crc kubenswrapper[4831]: command '/usr/local/bin/container-scripts/stop-ovsdb-server.sh' exited with 137: ++ dirname /usr/local/bin/container-scripts/stop-ovsdb-server.sh Dec 03 06:55:31 crc kubenswrapper[4831]: + source /usr/local/bin/container-scripts/functions Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNBridge=br-int Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNRemote=tcp:localhost:6642 Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNEncapType=geneve Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNAvailabilityZones= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ EnableChassisAsGateway=true Dec 03 06:55:31 crc kubenswrapper[4831]: ++ PhysicalNetworks= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ OVNHostName= Dec 03 06:55:31 crc kubenswrapper[4831]: ++ DB_FILE=/etc/openvswitch/conf.db Dec 03 06:55:31 crc kubenswrapper[4831]: ++ ovs_dir=/var/lib/openvswitch Dec 03 06:55:31 crc kubenswrapper[4831]: ++ FLOWS_RESTORE_SCRIPT=/var/lib/openvswitch/flows-script Dec 03 06:55:31 crc kubenswrapper[4831]: ++ FLOWS_RESTORE_DIR=/var/lib/openvswitch/saved-flows Dec 03 06:55:31 crc kubenswrapper[4831]: ++ SAFE_TO_STOP_OVSDB_SERVER_SEMAPHORE=/var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' 
-f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + sleep 0.5 Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + sleep 0.5 Dec 03 06:55:31 crc kubenswrapper[4831]: + '[' '!' -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server ']' Dec 03 06:55:31 crc kubenswrapper[4831]: + cleanup_ovsdb_server_semaphore Dec 03 06:55:31 crc kubenswrapper[4831]: + rm -f /var/lib/openvswitch/is_safe_to_stop_ovsdb_server Dec 03 06:55:31 crc kubenswrapper[4831]: + /usr/share/openvswitch/scripts/ovs-ctl stop --no-ovs-vswitchd Dec 03 06:55:31 crc kubenswrapper[4831]: > pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" containerID="cri-o://63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.666508 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" containerID="cri-o://63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" gracePeriod=29 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.678577 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0fdc967-7fb5-4702-b184-6953e8aefd19" (UID: "d0fdc967-7fb5-4702-b184-6953e8aefd19"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.695389 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-d6pll"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.730441 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.730702 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.731196 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-ovsdbserver-nb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.731225 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.740767 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-dc42-account-create-update-w5llk"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.742654 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8s4vh_3b5c67f9-4d9b-428a-a974-9162d81b1f02/openstack-network-exporter/0.log" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.742729 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8s4vh" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.748233 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-cell1-galera-0" podUID="e358bf07-df54-4268-9421-f31c57f5594c" containerName="galera" containerID="cri-o://a1ad2c7b7ffe4f7516b2ef2e3d38d3444b1dcf04767f85dd6169d820a50b087a" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.752400 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-dc42-account-create-update-w5llk"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.767830 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "d0fdc967-7fb5-4702-b184-6953e8aefd19" (UID: "d0fdc967-7fb5-4702-b184-6953e8aefd19"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.771363 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjh7q"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.820097 4831 generic.go:334] "Generic (PLEG): container finished" podID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerID="a6fcbb703b3c57a0767a6e84e0a057dc745d0f18dbb25eedf9c9e63269b59524" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.820192 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"267687cf-58da-42e0-852e-c8c87f2ea42a","Type":"ContainerDied","Data":"a6fcbb703b3c57a0767a6e84e0a057dc745d0f18dbb25eedf9c9e63269b59524"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.832752 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovn-rundir\") pod \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.832867 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-metrics-certs-tls-certs\") pod \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.833008 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-combined-ca-bundle\") pod \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.833091 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-gfd4c\" (UniqueName: \"kubernetes.io/projected/3b5c67f9-4d9b-428a-a974-9162d81b1f02-kube-api-access-gfd4c\") pod \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.833146 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5c67f9-4d9b-428a-a974-9162d81b1f02-config\") pod \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.833173 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovs-rundir\") pod \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\" (UID: \"3b5c67f9-4d9b-428a-a974-9162d81b1f02\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.834370 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0fdc967-7fb5-4702-b184-6953e8aefd19-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.834416 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovs-rundir" (OuterVolumeSpecName: "ovs-rundir") pod "3b5c67f9-4d9b-428a-a974-9162d81b1f02" (UID: "3b5c67f9-4d9b-428a-a974-9162d81b1f02"). InnerVolumeSpecName "ovs-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.834445 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "3b5c67f9-4d9b-428a-a974-9162d81b1f02" (UID: "3b5c67f9-4d9b-428a-a974-9162d81b1f02"). 
InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.838564 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b5c67f9-4d9b-428a-a974-9162d81b1f02-kube-api-access-gfd4c" (OuterVolumeSpecName: "kube-api-access-gfd4c") pod "3b5c67f9-4d9b-428a-a974-9162d81b1f02" (UID: "3b5c67f9-4d9b-428a-a974-9162d81b1f02"). InnerVolumeSpecName "kube-api-access-gfd4c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.838807 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.839030 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="c60bce87-ea0b-4b3d-8243-93ed40c232ff" containerName="nova-cell1-conductor-conductor" containerID="cri-o://d2b937efb5cb363c5d2b1ffebf2c05783efceefbe6e756b381e8b96438b287fb" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.840831 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3b5c67f9-4d9b-428a-a974-9162d81b1f02-config" (OuterVolumeSpecName: "config") pod "3b5c67f9-4d9b-428a-a974-9162d81b1f02" (UID: "3b5c67f9-4d9b-428a-a974-9162d81b1f02"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.841981 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.844410 4831 generic.go:334] "Generic (PLEG): container finished" podID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerID="504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.844857 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844cdc6797-kqpvp" event={"ID":"1f442a70-f040-4b0e-853d-6ce1f4caf63d","Type":"ContainerDied","Data":"504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.854799 4831 generic.go:334] "Generic (PLEG): container finished" podID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerID="d9f25cfe98a3d7189a069459787b3914756469c02ce6e44bffed77d599dd4887" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.854872 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50","Type":"ContainerDied","Data":"d9f25cfe98a3d7189a069459787b3914756469c02ce6e44bffed77d599dd4887"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.856723 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-869cfdc5c4-7898s" event={"ID":"e0b16411-0a37-4432-965b-746c2d70d00b","Type":"ContainerStarted","Data":"687ba234ddf57a90f8ad1f9e42d53b83f605abffd086bec5e4128003fa2936b1"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.858286 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.859258 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-lw9v5" event={"ID":"4f8ca881-226a-4311-aada-636335beea0d","Type":"ContainerDied","Data":"4ed9b53c437be18c4fdd3fa4018341e807fe81e32f2a54f0a2ad58fe0b7646d7"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.859299 4831 scope.go:117] "RemoveContainer" containerID="77f57d07ea85aecac94834d02ec77c6992bc37074e8198760a59e04a568577ff" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.920732 4831 generic.go:334] "Generic (PLEG): container finished" podID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerID="67de1998f02ce4f499b3387ae3a93d5ab8ad64720e8c404dce1cf6612058988f" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.921012 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a5d88e3-73a3-4f3d-af31-af675ab452bd","Type":"ContainerDied","Data":"67de1998f02ce4f499b3387ae3a93d5ab8ad64720e8c404dce1cf6612058988f"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.925134 4831 generic.go:334] "Generic (PLEG): container finished" podID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerID="bfdaec1e442053ec75746ee712e4fee25a008c9a2e11ba56c206f7698abbcfcd" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.925211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574cdc6988-72ggg" event={"ID":"770b98aa-f177-4c7b-b37e-1664c039f47d","Type":"ContainerDied","Data":"bfdaec1e442053ec75746ee712e4fee25a008c9a2e11ba56c206f7698abbcfcd"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.954774 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod 
"3b5c67f9-4d9b-428a-a974-9162d81b1f02" (UID: "3b5c67f9-4d9b-428a-a974-9162d81b1f02"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.960414 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-sb\") pod \"4f8ca881-226a-4311-aada-636335beea0d\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.960558 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-svc\") pod \"4f8ca881-226a-4311-aada-636335beea0d\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.960587 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-config\") pod \"4f8ca881-226a-4311-aada-636335beea0d\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.960702 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-swift-storage-0\") pod \"4f8ca881-226a-4311-aada-636335beea0d\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.960752 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztzkg\" (UniqueName: \"kubernetes.io/projected/4f8ca881-226a-4311-aada-636335beea0d-kube-api-access-ztzkg\") pod \"4f8ca881-226a-4311-aada-636335beea0d\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 
06:55:30.960792 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-nb\") pod \"4f8ca881-226a-4311-aada-636335beea0d\" (UID: \"4f8ca881-226a-4311-aada-636335beea0d\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.961196 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.961208 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfd4c\" (UniqueName: \"kubernetes.io/projected/3b5c67f9-4d9b-428a-a974-9162d81b1f02-kube-api-access-gfd4c\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.961220 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b5c67f9-4d9b-428a-a974-9162d81b1f02-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.961228 4831 reconciler_common.go:293] "Volume detached for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovs-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.961236 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/3b5c67f9-4d9b-428a-a974-9162d81b1f02-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.985531 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-gjh7q"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.986232 4831 generic.go:334] "Generic (PLEG): container finished" podID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" 
containerID="9e502551a8c629987632ae4db8d5bb2dcc87f5d6796281aca903d0518128dc14" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.986363 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55749f9879-hprsg" event={"ID":"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27","Type":"ContainerDied","Data":"9e502551a8c629987632ae4db8d5bb2dcc87f5d6796281aca903d0518128dc14"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:30.989522 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f8ca881-226a-4311-aada-636335beea0d-kube-api-access-ztzkg" (OuterVolumeSpecName: "kube-api-access-ztzkg") pod "4f8ca881-226a-4311-aada-636335beea0d" (UID: "4f8ca881-226a-4311-aada-636335beea0d"). InnerVolumeSpecName "kube-api-access-ztzkg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.000896 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f657b4b-bed8-4244-8727-2a3c59364041" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.001250 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w7h89" event={"ID":"7f657b4b-bed8-4244-8727-2a3c59364041","Type":"ContainerDied","Data":"63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.005125 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_d0fdc967-7fb5-4702-b184-6953e8aefd19/ovsdbserver-nb/0.log" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.005184 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"d0fdc967-7fb5-4702-b184-6953e8aefd19","Type":"ContainerDied","Data":"144fc818d205f39c5f8dc72c8b47e039d45e0bb1ac2fe71d12542c87f95b9669"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 
06:55:31.005272 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.008427 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "3b5c67f9-4d9b-428a-a974-9162d81b1f02" (UID: "3b5c67f9-4d9b-428a-a974-9162d81b1f02"). InnerVolumeSpecName "metrics-certs-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.015348 4831 generic.go:334] "Generic (PLEG): container finished" podID="4cc3003a-0145-4c6f-bf53-10c7c574874f" containerID="3aec4649bc0f7597a824ffeb353a8d7f39afe26459a358163be0cbfca4a1b51b" exitCode=137 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.017558 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d3273a2-0cc5-4a9d-8f8c-5828592973e8/ovsdbserver-sb/0.log" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.017581 4831 generic.go:334] "Generic (PLEG): container finished" podID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerID="f49e9e078c0c7dc85b489ebd0a280551621b057e73799f1a37d159eb18f5dff8" exitCode=2 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.017592 4831 generic.go:334] "Generic (PLEG): container finished" podID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerID="f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.034657 4831 generic.go:334] "Generic (PLEG): container finished" podID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerID="7d0ad17b9c4b331c9b8e4e43f1949619c278138c064021159c668b145db79938" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.054625 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="0678352c-a29a-4a58-908c-7066bbfe5825" path="/var/lib/kubelet/pods/0678352c-a29a-4a58-908c-7066bbfe5825/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.055177 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1e1232dc-f0a3-4694-ba92-127eaba9fda6" path="/var/lib/kubelet/pods/1e1232dc-f0a3-4694-ba92-127eaba9fda6/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.055703 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e478744-468b-40e8-b4a3-236bdd2bd5ca" path="/var/lib/kubelet/pods/6e478744-468b-40e8-b4a3-236bdd2bd5ca/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.057035 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ce49982-d05f-49dd-9b08-ae54e662b628" path="/var/lib/kubelet/pods/9ce49982-d05f-49dd-9b08-ae54e662b628/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.057907 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f2a305c-36bf-477e-8468-919407ea5d90" path="/var/lib/kubelet/pods/9f2a305c-36bf-477e-8468-919407ea5d90/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.058633 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a70b57c4-0e81-4c6f-95db-7e4dfb9231df" path="/var/lib/kubelet/pods/a70b57c4-0e81-4c6f-95db-7e4dfb9231df/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.060128 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa6da26d-604c-42a4-8c46-c3437894a4ae" path="/var/lib/kubelet/pods/aa6da26d-604c-42a4-8c46-c3437894a4ae/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.061137 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6e74a76-928d-4a03-ae60-6749fafef9ae" path="/var/lib/kubelet/pods/b6e74a76-928d-4a03-ae60-6749fafef9ae/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.061661 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="c09903ad-c95e-4958-904f-11dc2e7e52cb" path="/var/lib/kubelet/pods/c09903ad-c95e-4958-904f-11dc2e7e52cb/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.062191 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c58a3c00-1c5b-4c17-88d7-459798e81d76" path="/var/lib/kubelet/pods/c58a3c00-1c5b-4c17-88d7-459798e81d76/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.062846 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztzkg\" (UniqueName: \"kubernetes.io/projected/4f8ca881-226a-4311-aada-636335beea0d-kube-api-access-ztzkg\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.062869 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/3b5c67f9-4d9b-428a-a974-9162d81b1f02-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.063536 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e07ae49e-a6fb-478a-8b6f-3f8f687f4afd" path="/var/lib/kubelet/pods/e07ae49e-a6fb-478a-8b6f-3f8f687f4afd/volumes" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090602 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090629 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090639 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 
06:55:31.090647 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090658 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090666 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090676 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090687 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090695 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090704 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090712 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83" 
exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090719 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090726 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.090734 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5" exitCode=0 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.106487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-config" (OuterVolumeSpecName: "config") pod "4f8ca881-226a-4311-aada-636335beea0d" (UID: "4f8ca881-226a-4311-aada-636335beea0d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.107910 4831 generic.go:334] "Generic (PLEG): container finished" podID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerID="0c9038e6a18689fceba5e0a50ac6d4a0a041c704037c77f6242ca7ae37b78999" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.130948 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4f8ca881-226a-4311-aada-636335beea0d" (UID: "4f8ca881-226a-4311-aada-636335beea0d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.151845 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4f8ca881-226a-4311-aada-636335beea0d" (UID: "4f8ca881-226a-4311-aada-636335beea0d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.165544 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.165563 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.165571 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.169621 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-8s4vh_3b5c67f9-4d9b-428a-a974-9162d81b1f02/openstack-network-exporter/0.log" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.169716 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-8s4vh" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.180503 4831 generic.go:334] "Generic (PLEG): container finished" podID="fd770919-d80e-4947-b52d-673beb117374" containerID="aca1e2538c4e0fa0e9737c5d8550f625f94f8366334b8f2873812e5895924cfc" exitCode=143 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.180706 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api-log" containerID="cri-o://db914d2ef3f9530c88bd1d73542d378efb6e0c34e42edbdd03ab88b16f91baa9" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.180883 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api" containerID="cri-o://0fb9b727228c96fa187f5a0e156c0ff4cac686daaafe9b93470fdfcc2071ea6f" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.204160 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4f8ca881-226a-4311-aada-636335beea0d" (UID: "4f8ca881-226a-4311-aada-636335beea0d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.267202 4831 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.300460 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4f8ca881-226a-4311-aada-636335beea0d" (UID: "4f8ca881-226a-4311-aada-636335beea0d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334487 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334522 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wdqgf"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334534 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-wdqgf"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334552 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d3273a2-0cc5-4a9d-8f8c-5828592973e8","Type":"ContainerDied","Data":"f49e9e078c0c7dc85b489ebd0a280551621b057e73799f1a37d159eb18f5dff8"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334597 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6d4758c757-wtw5x"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334612 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d3273a2-0cc5-4a9d-8f8c-5828592973e8","Type":"ContainerDied","Data":"f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586","Type":"ContainerDied","Data":"7d0ad17b9c4b331c9b8e4e43f1949619c278138c064021159c668b145db79938"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334639 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334651 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334661 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334669 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334677 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d"} Dec 03 
06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334686 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334703 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334711 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334719 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334727 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334735 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334744 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334752 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d9bffbcdd-ztjkw" event={"ID":"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69","Type":"ContainerDied","Data":"0c9038e6a18689fceba5e0a50ac6d4a0a041c704037c77f6242ca7ae37b78999"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334771 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" event={"ID":"46af7209-8790-44ab-b255-8c84c3f5255a","Type":"ContainerStarted","Data":"805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334782 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-8s4vh" event={"ID":"3b5c67f9-4d9b-428a-a974-9162d81b1f02","Type":"ContainerDied","Data":"97b7e3a9d84ca408ddea79357809a1ee6dd0af40ff26afef0603c2ed7e5c186d"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" 
event={"ID":"fd770919-d80e-4947-b52d-673beb117374","Type":"ContainerDied","Data":"aca1e2538c4e0fa0e9737c5d8550f625f94f8366334b8f2873812e5895924cfc"} Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.334986 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6d4758c757-wtw5x" podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerName="proxy-httpd" containerID="cri-o://8b47e6d72bbe33bdcb24b9fec31efaddaf7bd0c2d98f21c24f88fbde38ed48a1" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.335634 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="1a18ae10-7f43-4072-b01c-1564735985be" containerName="nova-cell0-conductor-conductor" containerID="cri-o://3733faf327347b93ef92bc4d96624b9ad7e5d0b2f31cffa41222a7c371a88c3d" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.335836 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="aadb65a0-295d-4fcf-b148-44480346d357" containerName="nova-scheduler-scheduler" containerID="cri-o://42a7022437eee40dbce5d592f4a83cabd1f8ff8152cabd031da50508af22483d" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.336226 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/swift-proxy-6d4758c757-wtw5x" podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerName="proxy-server" containerID="cri-o://bbb0b5757917dd69f07b05eab8c7233245af6d5b34e3733e43bfb640fb96d6e3" gracePeriod=30 Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.357100 4831 scope.go:117] "RemoveContainer" containerID="2d3b92207030050487cb5607a18a0a68b0fe9b7d540ea69d42fbdbfb33812e73" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.375460 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4f8ca881-226a-4311-aada-636335beea0d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.512657 4831 scope.go:117] "RemoveContainer" containerID="0546a20824087822da4e65e207c3344405637cecf083c66a5cb892cd81e57ccb" Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.591768 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.591844 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data podName:dc0cbb94-92ec-4369-b609-f3186f302c66 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:35.591825598 +0000 UTC m=+1472.935409106 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data") pod "rabbitmq-server-0" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66") : configmap "rabbitmq-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.592250 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-lw9v5"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.639037 4831 scope.go:117] "RemoveContainer" containerID="bde47117f94d4b3d7621163e09c3a26678467d5110db8728318ccec32cffa515" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.652970 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-lw9v5"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.653233 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d3273a2-0cc5-4a9d-8f8c-5828592973e8/ovsdbserver-sb/0.log" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.653308 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.659775 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.687715 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.688218 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.696909 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config\") pod \"4cc3003a-0145-4c6f-bf53-10c7c574874f\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.697237 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-metrics-certs-tls-certs\") pod \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.697260 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grn7z\" (UniqueName: \"kubernetes.io/projected/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-kube-api-access-grn7z\") pod \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.697308 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndbcluster-sb-etc-ovn\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.697351 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q7wk\" (UniqueName: \"kubernetes.io/projected/4cc3003a-0145-4c6f-bf53-10c7c574874f-kube-api-access-5q7wk\") pod \"4cc3003a-0145-4c6f-bf53-10c7c574874f\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.697397 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config-secret\") pod \"4cc3003a-0145-4c6f-bf53-10c7c574874f\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.697829 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-combined-ca-bundle\") pod \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.697904 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-scripts\") pod 
\"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.697985 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-combined-ca-bundle\") pod \"4cc3003a-0145-4c6f-bf53-10c7c574874f\" (UID: \"4cc3003a-0145-4c6f-bf53-10c7c574874f\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.698060 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-config\") pod \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.698168 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdbserver-sb-tls-certs\") pod \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.698239 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdb-rundir\") pod \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\" (UID: \"1d3273a2-0cc5-4a9d-8f8c-5828592973e8\") " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.703545 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-config" (OuterVolumeSpecName: "config") pod "1d3273a2-0cc5-4a9d-8f8c-5828592973e8" (UID: "1d3273a2-0cc5-4a9d-8f8c-5828592973e8"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.705124 4831 secret.go:188] Couldn't get secret openstack/cinder-api-config-data: secret "cinder-api-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.705225 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:33.705197223 +0000 UTC m=+1471.048780791 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-api-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.706475 4831 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.706531 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:33.706516865 +0000 UTC m=+1471.050100363 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-config-data" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.706566 4831 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.706586 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:33.706577717 +0000 UTC m=+1471.050161215 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-scripts" not found Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.706827 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.706947 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.706969 4831 prober.go:104] "Probe errored" 
err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.707007 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdb-rundir" (OuterVolumeSpecName: "ovsdb-rundir") pod "1d3273a2-0cc5-4a9d-8f8c-5828592973e8" (UID: "1d3273a2-0cc5-4a9d-8f8c-5828592973e8"). InnerVolumeSpecName "ovsdb-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.707444 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-scripts" (OuterVolumeSpecName: "scripts") pod "1d3273a2-0cc5-4a9d-8f8c-5828592973e8" (UID: "1d3273a2-0cc5-4a9d-8f8c-5828592973e8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.707662 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-kube-api-access-grn7z" (OuterVolumeSpecName: "kube-api-access-grn7z") pod "1d3273a2-0cc5-4a9d-8f8c-5828592973e8" (UID: "1d3273a2-0cc5-4a9d-8f8c-5828592973e8"). InnerVolumeSpecName "kube-api-access-grn7z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.708849 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc3003a-0145-4c6f-bf53-10c7c574874f-kube-api-access-5q7wk" (OuterVolumeSpecName: "kube-api-access-5q7wk") pod "4cc3003a-0145-4c6f-bf53-10c7c574874f" (UID: "4cc3003a-0145-4c6f-bf53-10c7c574874f"). InnerVolumeSpecName "kube-api-access-5q7wk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.710136 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "ovndbcluster-sb-etc-ovn") pod "1d3273a2-0cc5-4a9d-8f8c-5828592973e8" (UID: "1d3273a2-0cc5-4a9d-8f8c-5828592973e8"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.710909 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.719734 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:31 crc kubenswrapper[4831]: E1203 06:55:31.719809 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" 
probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.724729 4831 scope.go:117] "RemoveContainer" containerID="85779452492d794c13e8c469cf65390b164504234a47b5ad1564537fb4273f5a" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.800687 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.800720 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5q7wk\" (UniqueName: \"kubernetes.io/projected/4cc3003a-0145-4c6f-bf53-10c7c574874f-kube-api-access-5q7wk\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.800735 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.800746 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.800758 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdb-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.800769 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grn7z\" (UniqueName: \"kubernetes.io/projected/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-kube-api-access-grn7z\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.850207 4831 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "4cc3003a-0145-4c6f-bf53-10c7c574874f" (UID: "4cc3003a-0145-4c6f-bf53-10c7c574874f"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.875723 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d3273a2-0cc5-4a9d-8f8c-5828592973e8" (UID: "1d3273a2-0cc5-4a9d-8f8c-5828592973e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.893753 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.910050 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.910082 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.910095 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.934527 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4cc3003a-0145-4c6f-bf53-10c7c574874f" (UID: "4cc3003a-0145-4c6f-bf53-10c7c574874f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:31 crc kubenswrapper[4831]: I1203 06:55:31.949753 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "4cc3003a-0145-4c6f-bf53-10c7c574874f" (UID: "4cc3003a-0145-4c6f-bf53-10c7c574874f"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.019904 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.019930 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/4cc3003a-0145-4c6f-bf53-10c7c574874f-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.024837 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "1d3273a2-0cc5-4a9d-8f8c-5828592973e8" (UID: "1d3273a2-0cc5-4a9d-8f8c-5828592973e8"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.071111 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.078750 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdbserver-sb-tls-certs" (OuterVolumeSpecName: "ovsdbserver-sb-tls-certs") pod "1d3273a2-0cc5-4a9d-8f8c-5828592973e8" (UID: "1d3273a2-0cc5-4a9d-8f8c-5828592973e8"). InnerVolumeSpecName "ovsdbserver-sb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.123399 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-ovsdbserver-sb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.123466 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/1d3273a2-0cc5-4a9d-8f8c-5828592973e8-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: E1203 06:55:32.226445 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:32 crc kubenswrapper[4831]: E1203 06:55:32.226532 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data podName:8d6ac806-4ac5-4de4-b6a0-b265032150f4 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:36.226509269 +0000 UTC m=+1473.570092787 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data") pod "rabbitmq-cell1-server-0" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4") : configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.232862 4831 scope.go:117] "RemoveContainer" containerID="3aec4649bc0f7597a824ffeb353a8d7f39afe26459a358163be0cbfca4a1b51b" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.233036 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.258272 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.268275 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-869cfdc5c4-7898s" event={"ID":"e0b16411-0a37-4432-965b-746c2d70d00b","Type":"ContainerStarted","Data":"ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.268350 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-869cfdc5c4-7898s" event={"ID":"e0b16411-0a37-4432-965b-746c2d70d00b","Type":"ContainerStarted","Data":"02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.268550 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" containerID="cri-o://02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.268682 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 
06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.268899 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.268957 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" containerID="cri-o://ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.294596 4831 generic.go:334] "Generic (PLEG): container finished" podID="e358bf07-df54-4268-9421-f31c57f5594c" containerID="a1ad2c7b7ffe4f7516b2ef2e3d38d3444b1dcf04767f85dd6169d820a50b087a" exitCode=0 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.294865 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e358bf07-df54-4268-9421-f31c57f5594c","Type":"ContainerDied","Data":"a1ad2c7b7ffe4f7516b2ef2e3d38d3444b1dcf04767f85dd6169d820a50b087a"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.304565 4831 generic.go:334] "Generic (PLEG): container finished" podID="fd770919-d80e-4947-b52d-673beb117374" containerID="4de12fb31b83a10dc830b04fa625356818e600bca291e7e4c8c179b0fba4550c" exitCode=0 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.304661 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" event={"ID":"fd770919-d80e-4947-b52d-673beb117374","Type":"ContainerDied","Data":"4de12fb31b83a10dc830b04fa625356818e600bca291e7e4c8c179b0fba4550c"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.328715 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2vjs5\" (UniqueName: \"kubernetes.io/projected/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-kube-api-access-2vjs5\") pod \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\" 
(UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.328835 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-config-data\") pod \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.328925 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-vencrypt-tls-certs\") pod \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.328973 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-combined-ca-bundle\") pod \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.329193 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-nova-novncproxy-tls-certs\") pod \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\" (UID: \"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.343349 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_1d3273a2-0cc5-4a9d-8f8c-5828592973e8/ovsdbserver-sb/0.log" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.343442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"1d3273a2-0cc5-4a9d-8f8c-5828592973e8","Type":"ContainerDied","Data":"676c74f7f166e4b0b998e8092bfdfcbcb20b027f142d46b6971ffb74921f4f1c"} 
Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.343531 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.345131 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-kube-api-access-2vjs5" (OuterVolumeSpecName: "kube-api-access-2vjs5") pod "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" (UID: "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2"). InnerVolumeSpecName "kube-api-access-2vjs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.350973 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.367789 4831 generic.go:334] "Generic (PLEG): container finished" podID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerID="db914d2ef3f9530c88bd1d73542d378efb6e0c34e42edbdd03ab88b16f91baa9" exitCode=143 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.367862 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1","Type":"ContainerDied","Data":"db914d2ef3f9530c88bd1d73542d378efb6e0c34e42edbdd03ab88b16f91baa9"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.385523 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.396030 4831 scope.go:117] "RemoveContainer" containerID="f49e9e078c0c7dc85b489ebd0a280551621b057e73799f1a37d159eb18f5dff8" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.401905 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" event={"ID":"46af7209-8790-44ab-b255-8c84c3f5255a","Type":"ContainerStarted","Data":"5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.402158 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" podUID="46af7209-8790-44ab-b255-8c84c3f5255a" containerName="barbican-worker-log" containerID="cri-o://805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.402293 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" podUID="46af7209-8790-44ab-b255-8c84c3f5255a" containerName="barbican-worker" containerID="cri-o://5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.430405 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-869cfdc5c4-7898s" podStartSLOduration=5.430384811 podStartE2EDuration="5.430384811s" podCreationTimestamp="2025-12-03 06:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:32.310943327 +0000 UTC m=+1469.654526845" watchObservedRunningTime="2025-12-03 06:55:32.430384811 +0000 UTC m=+1469.773968309" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.430896 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-xdb7n\" (UniqueName: \"kubernetes.io/projected/fd770919-d80e-4947-b52d-673beb117374-kube-api-access-xdb7n\") pod \"fd770919-d80e-4947-b52d-673beb117374\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.430998 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data\") pod \"fd770919-d80e-4947-b52d-673beb117374\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.431200 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e358bf07-df54-4268-9421-f31c57f5594c-config-data-generated\") pod \"e358bf07-df54-4268-9421-f31c57f5594c\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.431287 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"e358bf07-df54-4268-9421-f31c57f5594c\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.431372 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data-custom\") pod \"fd770919-d80e-4947-b52d-673beb117374\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.431523 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd770919-d80e-4947-b52d-673beb117374-logs\") pod \"fd770919-d80e-4947-b52d-673beb117374\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " 
Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.431625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-galera-tls-certs\") pod \"e358bf07-df54-4268-9421-f31c57f5594c\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.431984 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-config-data-default\") pod \"e358bf07-df54-4268-9421-f31c57f5594c\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.432120 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-combined-ca-bundle\") pod \"fd770919-d80e-4947-b52d-673beb117374\" (UID: \"fd770919-d80e-4947-b52d-673beb117374\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.432217 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdgmf\" (UniqueName: \"kubernetes.io/projected/e358bf07-df54-4268-9421-f31c57f5594c-kube-api-access-tdgmf\") pod \"e358bf07-df54-4268-9421-f31c57f5594c\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.435504 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-kolla-config\") pod \"e358bf07-df54-4268-9421-f31c57f5594c\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.435613 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-operator-scripts\") pod \"e358bf07-df54-4268-9421-f31c57f5594c\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.435715 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-combined-ca-bundle\") pod \"e358bf07-df54-4268-9421-f31c57f5594c\" (UID: \"e358bf07-df54-4268-9421-f31c57f5594c\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.436598 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2vjs5\" (UniqueName: \"kubernetes.io/projected/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-kube-api-access-2vjs5\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.436834 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e358bf07-df54-4268-9421-f31c57f5594c-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "e358bf07-df54-4268-9421-f31c57f5594c" (UID: "e358bf07-df54-4268-9421-f31c57f5594c"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.435016 4831 generic.go:334] "Generic (PLEG): container finished" podID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerID="499ef1cada2eb4986f003c7c29c7c54c82c99a884b5bcf1f1b4f4fbda65a3300" exitCode=0 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.435043 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55749f9879-hprsg" event={"ID":"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27","Type":"ContainerDied","Data":"499ef1cada2eb4986f003c7c29c7c54c82c99a884b5bcf1f1b4f4fbda65a3300"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.440110 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e358bf07-df54-4268-9421-f31c57f5594c" (UID: "e358bf07-df54-4268-9421-f31c57f5594c"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.440594 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd770919-d80e-4947-b52d-673beb117374-logs" (OuterVolumeSpecName: "logs") pod "fd770919-d80e-4947-b52d-673beb117374" (UID: "fd770919-d80e-4947-b52d-673beb117374"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.441264 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "e358bf07-df54-4268-9421-f31c57f5594c" (UID: "e358bf07-df54-4268-9421-f31c57f5594c"). InnerVolumeSpecName "config-data-default". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.442826 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" (UID: "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.443731 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e358bf07-df54-4268-9421-f31c57f5594c" (UID: "e358bf07-df54-4268-9421-f31c57f5594c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.443808 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fd770919-d80e-4947-b52d-673beb117374" (UID: "fd770919-d80e-4947-b52d-673beb117374"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.450157 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-768d8958fd-hbthr"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.456141 4831 generic.go:334] "Generic (PLEG): container finished" podID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerID="bbb0b5757917dd69f07b05eab8c7233245af6d5b34e3733e43bfb640fb96d6e3" exitCode=0 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.456174 4831 generic.go:334] "Generic (PLEG): container finished" podID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerID="8b47e6d72bbe33bdcb24b9fec31efaddaf7bd0c2d98f21c24f88fbde38ed48a1" exitCode=0 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.456232 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d4758c757-wtw5x" event={"ID":"5d44a6a4-631f-4ccd-ac61-781623a04c11","Type":"ContainerDied","Data":"bbb0b5757917dd69f07b05eab8c7233245af6d5b34e3733e43bfb640fb96d6e3"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.456257 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d4758c757-wtw5x" event={"ID":"5d44a6a4-631f-4ccd-ac61-781623a04c11","Type":"ContainerDied","Data":"8b47e6d72bbe33bdcb24b9fec31efaddaf7bd0c2d98f21c24f88fbde38ed48a1"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.472147 4831 generic.go:334] "Generic (PLEG): container finished" podID="d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" containerID="a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7" exitCode=0 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.472345 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.473586 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2","Type":"ContainerDied","Data":"a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.473625 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"d0bdb64e-afcf-408c-90ea-fa8532c1a6f2","Type":"ContainerDied","Data":"27b6a9bc76e14bc76d928be747146d339bd492ea8531c28e69fd478b6d6d0d1d"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.477201 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e358bf07-df54-4268-9421-f31c57f5594c-kube-api-access-tdgmf" (OuterVolumeSpecName: "kube-api-access-tdgmf") pod "e358bf07-df54-4268-9421-f31c57f5594c" (UID: "e358bf07-df54-4268-9421-f31c57f5594c"). InnerVolumeSpecName "kube-api-access-tdgmf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.477808 4831 generic.go:334] "Generic (PLEG): container finished" podID="fa796212-03db-4860-93f4-d2918ed44070" containerID="fd1e277c8951121dadc44ba354b735192014953fab00452b13030855e8591cd1" exitCode=0 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.477912 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fa796212-03db-4860-93f4-d2918ed44070","Type":"ContainerDied","Data":"fd1e277c8951121dadc44ba354b735192014953fab00452b13030855e8591cd1"} Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.482456 4831 scope.go:117] "RemoveContainer" containerID="f53e3cafe0fbef5b8163e8c9a9b80c9692ebb304d6df1d4e41059696b296c210" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.489260 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "mysql-db") pod "e358bf07-df54-4268-9421-f31c57f5594c" (UID: "e358bf07-df54-4268-9421-f31c57f5594c"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.492849 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd770919-d80e-4947-b52d-673beb117374-kube-api-access-xdb7n" (OuterVolumeSpecName: "kube-api-access-xdb7n") pod "fd770919-d80e-4947-b52d-673beb117374" (UID: "fd770919-d80e-4947-b52d-673beb117374"). InnerVolumeSpecName "kube-api-access-xdb7n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.494130 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutronb468-account-delete-9hl65"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.501101 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" podStartSLOduration=5.501081973 podStartE2EDuration="5.501081973s" podCreationTimestamp="2025-12-03 06:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:32.446737398 +0000 UTC m=+1469.790320916" watchObservedRunningTime="2025-12-03 06:55:32.501081973 +0000 UTC m=+1469.844665481" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.522835 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.523113 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-config-data" (OuterVolumeSpecName: "config-data") pod "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" (UID: "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.533877 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563137 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdb7n\" (UniqueName: \"kubernetes.io/projected/fd770919-d80e-4947-b52d-673beb117374-kube-api-access-xdb7n\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563290 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/e358bf07-df54-4268-9421-f31c57f5594c-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563400 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563459 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563512 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd770919-d80e-4947-b52d-673beb117374-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563568 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563620 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdgmf\" (UniqueName: 
\"kubernetes.io/projected/e358bf07-df54-4268-9421-f31c57f5594c-kube-api-access-tdgmf\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563679 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563731 4831 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563786 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.563836 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e358bf07-df54-4268-9421-f31c57f5594c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.567616 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fd770919-d80e-4947-b52d-673beb117374" (UID: "fd770919-d80e-4947-b52d-673beb117374"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.567824 4831 scope.go:117] "RemoveContainer" containerID="a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.571737 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-nova-novncproxy-tls-certs" (OuterVolumeSpecName: "nova-novncproxy-tls-certs") pod "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" (UID: "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2"). InnerVolumeSpecName "nova-novncproxy-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.594156 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.619961 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "e358bf07-df54-4268-9421-f31c57f5594c" (UID: "e358bf07-df54-4268-9421-f31c57f5594c"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.620082 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-vencrypt-tls-certs" (OuterVolumeSpecName: "vencrypt-tls-certs") pod "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" (UID: "d0bdb64e-afcf-408c-90ea-fa8532c1a6f2"). InnerVolumeSpecName "vencrypt-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.628454 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e358bf07-df54-4268-9421-f31c57f5594c" (UID: "e358bf07-df54-4268-9421-f31c57f5594c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.633326 4831 scope.go:117] "RemoveContainer" containerID="a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.634468 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data" (OuterVolumeSpecName: "config-data") pod "fd770919-d80e-4947-b52d-673beb117374" (UID: "fd770919-d80e-4947-b52d-673beb117374"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: E1203 06:55:32.637406 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7\": container with ID starting with a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7 not found: ID does not exist" containerID="a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.637443 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7"} err="failed to get container status \"a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7\": rpc error: code = NotFound desc = could not find container \"a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7\": container with ID starting with a083ce8f23553bd89f6c5a7fa97eb108d3aaa5ca11ad5ec36e50548b3530fae7 not found: ID does not exist" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.666125 4831 reconciler_common.go:293] "Volume detached for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-vencrypt-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.666181 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.666194 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.666205 4831 
reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.666215 4831 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/e358bf07-df54-4268-9421-f31c57f5594c-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.666235 4831 reconciler_common.go:293] "Volume detached for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2-nova-novncproxy-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.666248 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd770919-d80e-4947-b52d-673beb117374-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.728307 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.103:5671: connect: connection refused" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.733474 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.819082 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.819519 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="ceilometer-central-agent" containerID="cri-o://fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.819759 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="proxy-httpd" containerID="cri-o://261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.819854 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="sg-core" containerID="cri-o://e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.819943 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="ceilometer-notification-agent" containerID="cri-o://7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.872592 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-logs\") pod \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " Dec 03 06:55:32 crc kubenswrapper[4831]: 
I1203 06:55:32.872664 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m4tm6\" (UniqueName: \"kubernetes.io/projected/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-kube-api-access-m4tm6\") pod \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.872713 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data-custom\") pod \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.872767 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data\") pod \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.872819 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-combined-ca-bundle\") pod \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\" (UID: \"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27\") " Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.875060 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-logs" (OuterVolumeSpecName: "logs") pod "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" (UID: "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.875384 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.875663 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="5bf96e96-13ba-44c9-b16e-b1c2acbfc643" containerName="kube-state-metrics" containerID="cri-o://a1a3fe54cd3d30665e97d6fbc19db3951584ee5c9ac2f3ff590c8299afe5e9f9" gracePeriod=30 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.909190 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.922829 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-kube-api-access-m4tm6" (OuterVolumeSpecName: "kube-api-access-m4tm6") pod "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" (UID: "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27"). InnerVolumeSpecName "kube-api-access-m4tm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.926931 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" (UID: "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.928994 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.948051 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-6d4758c757-wtw5x" Dec 03 06:55:32 crc kubenswrapper[4831]: E1203 06:55:32.956795 4831 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err=< Dec 03 06:55:32 crc kubenswrapper[4831]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-03T06:55:30Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 03 06:55:32 crc kubenswrapper[4831]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" Dec 03 06:55:32 crc kubenswrapper[4831]: > execCommand=["/usr/share/ovn/scripts/ovn-ctl","stop_controller"] containerName="ovn-controller" pod="openstack/ovn-controller-95nwv" message=< Dec 03 06:55:32 crc kubenswrapper[4831]: Exiting ovn-controller (1) [FAILED] Dec 03 06:55:32 crc kubenswrapper[4831]: Killing ovn-controller (1) [ OK ] Dec 03 06:55:32 crc kubenswrapper[4831]: 2025-12-03T06:55:30Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 03 06:55:32 crc kubenswrapper[4831]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" Dec 03 06:55:32 crc kubenswrapper[4831]: > Dec 03 06:55:32 crc kubenswrapper[4831]: E1203 06:55:32.956833 4831 kuberuntime_container.go:691] "PreStop hook failed" err=< Dec 03 06:55:32 crc kubenswrapper[4831]: command '/usr/share/ovn/scripts/ovn-ctl stop_controller' exited with 137: 2025-12-03T06:55:30Z|00001|fatal_signal|WARN|terminating with signal 14 (Alarm clock) Dec 03 06:55:32 crc kubenswrapper[4831]: /etc/init.d/functions: line 589: 407 Alarm clock "$@" Dec 03 06:55:32 crc kubenswrapper[4831]: > pod="openstack/ovn-controller-95nwv" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" containerID="cri-o://cfbeb31890e53191672906e82bdc1b8672d17ef21f7bf4d793893c62a489706c" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.956864 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-controller-95nwv" podUID="5fe1a689-1241-4c11-93ca-875e53319668" 
containerName="ovn-controller" containerID="cri-o://cfbeb31890e53191672906e82bdc1b8672d17ef21f7bf4d793893c62a489706c" gracePeriod=27 Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.957102 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-95nwv" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" probeResult="failure" output="" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.991624 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.991689 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m4tm6\" (UniqueName: \"kubernetes.io/projected/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-kube-api-access-m4tm6\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.991706 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.995837 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 03 06:55:32 crc kubenswrapper[4831]: I1203 06:55:32.996041 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/memcached-0" podUID="e1ffe861-7d12-49e2-9737-fc100833da39" containerName="memcached" containerID="cri-o://2a6aed804c2a9f3ed8236fffc4704163803135cc2b4f57e197408d2f4c85bb43" gracePeriod=30 Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.045480 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" 
(UID: "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-log-httpd\") pod \"5d44a6a4-631f-4ccd-ac61-781623a04c11\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096071 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-internal-tls-certs\") pod \"5d44a6a4-631f-4ccd-ac61-781623a04c11\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096101 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-run-httpd\") pod \"5d44a6a4-631f-4ccd-ac61-781623a04c11\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096192 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-config-data\") pod \"5d44a6a4-631f-4ccd-ac61-781623a04c11\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096232 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-etc-swift\") pod \"5d44a6a4-631f-4ccd-ac61-781623a04c11\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096249 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-combined-ca-bundle\") pod \"5d44a6a4-631f-4ccd-ac61-781623a04c11\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096280 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcbbp\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-kube-api-access-pcbbp\") pod \"5d44a6a4-631f-4ccd-ac61-781623a04c11\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096342 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-public-tls-certs\") pod \"5d44a6a4-631f-4ccd-ac61-781623a04c11\" (UID: \"5d44a6a4-631f-4ccd-ac61-781623a04c11\") " Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.096947 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.105870 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "5d44a6a4-631f-4ccd-ac61-781623a04c11" (UID: "5d44a6a4-631f-4ccd-ac61-781623a04c11"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.106176 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "5d44a6a4-631f-4ccd-ac61-781623a04c11" (UID: "5d44a6a4-631f-4ccd-ac61-781623a04c11"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.129625 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-kube-api-access-pcbbp" (OuterVolumeSpecName: "kube-api-access-pcbbp") pod "5d44a6a4-631f-4ccd-ac61-781623a04c11" (UID: "5d44a6a4-631f-4ccd-ac61-781623a04c11"). InnerVolumeSpecName "kube-api-access-pcbbp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.159961 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" path="/var/lib/kubelet/pods/1d3273a2-0cc5-4a9d-8f8c-5828592973e8/volumes" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.163116 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc3003a-0145-4c6f-bf53-10c7c574874f" path="/var/lib/kubelet/pods/4cc3003a-0145-4c6f-bf53-10c7c574874f/volumes" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.164450 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f8ca881-226a-4311-aada-636335beea0d" path="/var/lib/kubelet/pods/4f8ca881-226a-4311-aada-636335beea0d/volumes" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.165146 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce67ad1c-2870-4109-8fc8-86f35414b1ea" path="/var/lib/kubelet/pods/ce67ad1c-2870-4109-8fc8-86f35414b1ea/volumes" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 
06:55:33.172429 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" path="/var/lib/kubelet/pods/d0bdb64e-afcf-408c-90ea-fa8532c1a6f2/volumes" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.172526 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "5d44a6a4-631f-4ccd-ac61-781623a04c11" (UID: "5d44a6a4-631f-4ccd-ac61-781623a04c11"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.199745 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.199772 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcbbp\" (UniqueName: \"kubernetes.io/projected/5d44a6a4-631f-4ccd-ac61-781623a04c11-kube-api-access-pcbbp\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.199783 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.199792 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5d44a6a4-631f-4ccd-ac61-781623a04c11-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.219447 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-config-data" (OuterVolumeSpecName: "config-data") pod "5d44a6a4-631f-4ccd-ac61-781623a04c11" (UID: 
"5d44a6a4-631f-4ccd-ac61-781623a04c11"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.241222 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data" (OuterVolumeSpecName: "config-data") pod "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" (UID: "3a92648c-9b8b-4fcf-b028-3f59ce2ebf27"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.302025 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.302072 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.311220 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d44a6a4-631f-4ccd-ac61-781623a04c11" (UID: "5d44a6a4-631f-4ccd-ac61-781623a04c11"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.388157 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d44a6a4-631f-4ccd-ac61-781623a04c11" (UID: "5d44a6a4-631f-4ccd-ac61-781623a04c11"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.394503 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d44a6a4-631f-4ccd-ac61-781623a04c11" (UID: "5d44a6a4-631f-4ccd-ac61-781623a04c11"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.404037 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.404071 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.404080 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d44a6a4-631f-4ccd-ac61-781623a04c11-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.442616 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-vgj9j"] Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.442652 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-vgj9j"] Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.442666 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-665dcf9f4f-8g7pd"] Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.442682 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-89rlq"] Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 
06:55:33.442695 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-89rlq"] Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.442709 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone7cf1-account-delete-k424c"] Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443112 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerName="barbican-worker-log" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443125 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerName="barbican-worker-log" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443146 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerName="barbican-worker" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443154 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerName="barbican-worker" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443165 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e358bf07-df54-4268-9421-f31c57f5594c" containerName="mysql-bootstrap" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443170 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e358bf07-df54-4268-9421-f31c57f5594c" containerName="mysql-bootstrap" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443184 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e358bf07-df54-4268-9421-f31c57f5594c" containerName="galera" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443190 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e358bf07-df54-4268-9421-f31c57f5594c" containerName="galera" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443199 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerName="proxy-httpd" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443205 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerName="proxy-httpd" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443214 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerName="proxy-server" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443219 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerName="proxy-server" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443231 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443237 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443246 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443251 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443263 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd770919-d80e-4947-b52d-673beb117374" containerName="barbican-keystone-listener-log" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443268 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd770919-d80e-4947-b52d-673beb117374" containerName="barbican-keystone-listener-log" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443274 4831 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd770919-d80e-4947-b52d-673beb117374" containerName="barbican-keystone-listener" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443279 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd770919-d80e-4947-b52d-673beb117374" containerName="barbican-keystone-listener" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443290 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerName="ovsdbserver-nb" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443297 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerName="ovsdbserver-nb" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443326 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443334 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443348 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8ca881-226a-4311-aada-636335beea0d" containerName="init" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443353 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8ca881-226a-4311-aada-636335beea0d" containerName="init" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443369 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="ovsdbserver-sb" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443375 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="ovsdbserver-sb" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443385 
4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b5c67f9-4d9b-428a-a974-9162d81b1f02" containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443391 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b5c67f9-4d9b-428a-a974-9162d81b1f02" containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.443402 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f8ca881-226a-4311-aada-636335beea0d" containerName="dnsmasq-dns" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443407 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f8ca881-226a-4311-aada-636335beea0d" containerName="dnsmasq-dns" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443569 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="ovsdbserver-sb" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443582 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443589 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd770919-d80e-4947-b52d-673beb117374" containerName="barbican-keystone-listener-log" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443598 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerName="proxy-httpd" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443606 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0bdb64e-afcf-408c-90ea-fa8532c1a6f2" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443617 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b5c67f9-4d9b-428a-a974-9162d81b1f02" 
containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443625 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e358bf07-df54-4268-9421-f31c57f5594c" containerName="galera" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443636 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerName="barbican-worker" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443645 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" containerName="barbican-worker-log" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443652 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" containerName="proxy-server" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443661 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3273a2-0cc5-4a9d-8f8c-5828592973e8" containerName="openstack-network-exporter" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443673 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f8ca881-226a-4311-aada-636335beea0d" containerName="dnsmasq-dns" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443679 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd770919-d80e-4947-b52d-673beb117374" containerName="barbican-keystone-listener" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.443689 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" containerName="ovsdbserver-nb" Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444203 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone7cf1-account-delete-k424c"] Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444220 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] 
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444233 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-9jjj4"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444243 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-9jjj4"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444254 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone7cf1-account-delete-k424c"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444263 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-7cf1-account-create-update-qc9fq"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444272 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-7cf1-account-create-update-qc9fq"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444289 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican3e51-account-delete-4jnl5"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444298 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novacell07556-account-delete-rtljc"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444308 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/novaapi8edc-account-delete-2pvdg"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444336 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance3e24-account-delete-nkv48"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444345 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-2ms27"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444355 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-2ms27"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444365 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-b468-account-create-update-h97z9"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444965 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone7cf1-account-delete-k424c"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.444957 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/keystone-665dcf9f4f-8g7pd" podUID="5067e964-1daa-4bbd-8e2b-872ce1067389" containerName="keystone-api" containerID="cri-o://8f1ceb54accf6cc95db136b64ce1f4080a7894f0ee7b6160b877d7044f1c4354" gracePeriod=30
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.450112 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronb468-account-delete-9hl65"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.478205 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b468-account-create-update-h97z9"]
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.494638 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.498251 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.500507 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" cmd=["/usr/local/bin/container-scripts/status_check.sh"]
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.500537 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-northd-0" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="ovn-northd"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.506752 4831 generic.go:334] "Generic (PLEG): container finished" podID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerID="4dfb2678a75458c31ce4e501a156049718eef44858bc8bb12bca6a8c8f4adbfa" exitCode=0
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.506809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d9bffbcdd-ztjkw" event={"ID":"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69","Type":"ContainerDied","Data":"4dfb2678a75458c31ce4e501a156049718eef44858bc8bb12bca6a8c8f4adbfa"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.537573 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder3a6d-account-delete-b6ptg"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.570005 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0"
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.570584 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d2b937efb5cb363c5d2b1ffebf2c05783efceefbe6e756b381e8b96438b287fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.570771 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"e358bf07-df54-4268-9421-f31c57f5594c","Type":"ContainerDied","Data":"2e5e2bffb22c4da37da1e4f70e3ae92cb8da1e7129f3872335a5c10d4181003f"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.570839 4831 scope.go:117] "RemoveContainer" containerID="a1ad2c7b7ffe4f7516b2ef2e3d38d3444b1dcf04767f85dd6169d820a50b087a"
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.572953 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d2b937efb5cb363c5d2b1ffebf2c05783efceefbe6e756b381e8b96438b287fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.576463 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc" event={"ID":"fd770919-d80e-4947-b52d-673beb117374","Type":"ContainerDied","Data":"e98936f85844d71031d7dda15ce52290cc961d207f2d56462def738170a46451"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.576719 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.583292 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-l964v"]
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.590906 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="d2b937efb5cb363c5d2b1ffebf2c05783efceefbe6e756b381e8b96438b287fb" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.590987 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="c60bce87-ea0b-4b3d-8243-93ed40c232ff" containerName="nova-cell1-conductor-conductor"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.605769 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-55749f9879-hprsg" event={"ID":"3a92648c-9b8b-4fcf-b028-3f59ce2ebf27","Type":"ContainerDied","Data":"2c73abbc92e0058e3d6c321b667f77cdef9878197d405a2c30654264d9aee524"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.605901 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-55749f9879-hprsg"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.639850 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-95nwv_5fe1a689-1241-4c11-93ca-875e53319668/ovn-controller/0.log"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.639895 4831 generic.go:334] "Generic (PLEG): container finished" podID="5fe1a689-1241-4c11-93ca-875e53319668" containerID="cfbeb31890e53191672906e82bdc1b8672d17ef21f7bf4d793893c62a489706c" exitCode=143
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.639949 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv" event={"ID":"5fe1a689-1241-4c11-93ca-875e53319668","Type":"ContainerDied","Data":"cfbeb31890e53191672906e82bdc1b8672d17ef21f7bf4d793893c62a489706c"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.641339 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" event={"ID":"4b96e376-8104-4a92-b1ec-6078943d0b50","Type":"ContainerStarted","Data":"0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.641370 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" event={"ID":"4b96e376-8104-4a92-b1ec-6078943d0b50","Type":"ContainerStarted","Data":"9ccfc29d10c7135ff0ec34002d80d413f8e103fa94d72ba9d8371cb4bfe9fee0"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.643529 4831 generic.go:334] "Generic (PLEG): container finished" podID="5bf96e96-13ba-44c9-b16e-b1c2acbfc643" containerID="a1a3fe54cd3d30665e97d6fbc19db3951584ee5c9ac2f3ff590c8299afe5e9f9" exitCode=2
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.643576 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5bf96e96-13ba-44c9-b16e-b1c2acbfc643","Type":"ContainerDied","Data":"a1a3fe54cd3d30665e97d6fbc19db3951584ee5c9ac2f3ff590c8299afe5e9f9"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.644991 4831 generic.go:334] "Generic (PLEG): container finished" podID="6f54df74-81ac-43f7-9075-51cb26200c4e" containerID="23562686db7ab5892626a9f6b6ffc89b32e0c913addd62ae2225fd861ed8de86" exitCode=1
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.647535 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-l964v"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.647571 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb468-account-delete-9hl65" event={"ID":"6f54df74-81ac-43f7-9075-51cb26200c4e","Type":"ContainerDied","Data":"23562686db7ab5892626a9f6b6ffc89b32e0c913addd62ae2225fd861ed8de86"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.647593 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb468-account-delete-9hl65" event={"ID":"6f54df74-81ac-43f7-9075-51cb26200c4e","Type":"ContainerStarted","Data":"2b1f15a184562cc4375ff199a0fe5d2f16f37d8ddc349b48eb679d5612ccb59e"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.653705 4831 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openstack/neutronb468-account-delete-9hl65" secret="" err="secret \"galera-openstack-dockercfg-r7vst\" not found"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.653791 4831 scope.go:117] "RemoveContainer" containerID="23562686db7ab5892626a9f6b6ffc89b32e0c913addd62ae2225fd861ed8de86"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.664629 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement2619-account-delete-d6v4b"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.684679 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-2619-account-create-update-tlzdz"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.688735 4831 generic.go:334] "Generic (PLEG): container finished" podID="46af7209-8790-44ab-b255-8c84c3f5255a" containerID="805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7" exitCode=143
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.688826 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" event={"ID":"46af7209-8790-44ab-b255-8c84c3f5255a","Type":"ContainerDied","Data":"805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.706741 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement2619-account-delete-d6v4b"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.718086 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-2619-account-create-update-tlzdz"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.723669 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-6d4758c757-wtw5x" event={"ID":"5d44a6a4-631f-4ccd-ac61-781623a04c11","Type":"ContainerDied","Data":"0b5fbb7474715b1f9abe3e4ac1638b3adf55f55ae3636addae962a7c63758e8a"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.723774 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-6d4758c757-wtw5x"
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.746006 4831 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.746599 4831 secret.go:188] Couldn't get secret openstack/cinder-api-config-data: secret "cinder-api-config-data" not found
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.746602 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts podName:6f54df74-81ac-43f7-9075-51cb26200c4e nodeName:}" failed. No retries permitted until 2025-12-03 06:55:34.246559424 +0000 UTC m=+1471.590142932 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts") pod "neutronb468-account-delete-9hl65" (UID: "6f54df74-81ac-43f7-9075-51cb26200c4e") : configmap "openstack-scripts" not found
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.746660 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:37.746645557 +0000 UTC m=+1475.090229065 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data-custom" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-api-config-data" not found
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.746160 4831 secret.go:188] Couldn't get secret openstack/cinder-scripts: secret "cinder-scripts" not found
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.746681 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:37.746675628 +0000 UTC m=+1475.090259136 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "scripts" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-scripts" not found
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.746129 4831 secret.go:188] Couldn't get secret openstack/cinder-config-data: secret "cinder-config-data" not found
Dec 03 06:55:33 crc kubenswrapper[4831]: E1203 06:55:33.746701 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data podName:1eab7a3f-11f0-4d00-b436-93cc30c2e8e1 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:37.746696058 +0000 UTC m=+1475.090279566 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data") pod "cinder-api-0" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1") : secret "cinder-config-data" not found
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.781386 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-h8zhk"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.787070 4831 generic.go:334] "Generic (PLEG): container finished" podID="e0b16411-0a37-4432-965b-746c2d70d00b" containerID="02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c" exitCode=143
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.787178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-869cfdc5c4-7898s" event={"ID":"e0b16411-0a37-4432-965b-746c2d70d00b","Type":"ContainerDied","Data":"02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.794470 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-h8zhk"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.804924 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-3a6d-account-create-update-gb5qw"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.809521 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:58822->10.217.0.204:8775: read: connection reset by peer"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.809812 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.204:8775/\": read tcp 10.217.0.2:58806->10.217.0.204:8775: read: connection reset by peer"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.822442 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-3a6d-account-create-update-gb5qw"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.855956 4831 generic.go:334] "Generic (PLEG): container finished" podID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerID="261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8" exitCode=0
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.855988 4831 generic.go:334] "Generic (PLEG): container finished" podID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerID="e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4" exitCode=2
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.856032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerDied","Data":"261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.856056 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerDied","Data":"e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.871373 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder3a6d-account-delete-b6ptg"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.887035 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance3e24-account-delete-nkv48" event={"ID":"54eb5ffe-7e2f-4a33-9689-0470affe10e0","Type":"ContainerStarted","Data":"df3f1f9fe5181fd0912bd9e87bbd99cebc16052fe1a67dcafb470ec88c877bd0"}
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.887725 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-574cdc6988-72ggg" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:34044->10.217.0.159:9311: read: connection reset by peer"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.887739 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-574cdc6988-72ggg" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.159:9311/healthcheck\": read tcp 10.217.0.2:34050->10.217.0.159:9311: read: connection reset by peer"
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.940756 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-vzchf"]
Dec 03 06:55:33 crc kubenswrapper[4831]: I1203 06:55:33.979143 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-vzchf"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.057571 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3e24-account-create-update-zbd4p"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.070955 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3733faf327347b93ef92bc4d96624b9ad7e5d0b2f31cffa41222a7c371a88c3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.093987 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3733faf327347b93ef92bc4d96624b9ad7e5d0b2f31cffa41222a7c371a88c3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.098444 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3e24-account-create-update-zbd4p"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.107566 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="3733faf327347b93ef92bc4d96624b9ad7e5d0b2f31cffa41222a7c371a88c3d" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.107636 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="1a18ae10-7f43-4072-b01c-1564735985be" containerName="nova-cell0-conductor-conductor"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.126291 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance3e24-account-delete-nkv48"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.148238 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-8rkh8"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.148295 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-8rkh8"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.149059 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-3e51-account-create-update-k4ppw"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.174705 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican3e51-account-delete-4jnl5"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.186207 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-3e51-account-create-update-k4ppw"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.203616 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-w5hvd"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.208260 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstack-galera-0" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" containerName="galera" containerID="cri-o://2083069d237ad9e634573977e455af0f77cad2d64fa39a6dce0010ff03639849" gracePeriod=30
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.213841 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-w5hvd"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.229938 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-7556-account-create-update-h24bn"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.238109 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell07556-account-delete-rtljc"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.243663 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-7556-account-create-update-h24bn"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.275725 4831 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.275836 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts podName:6f54df74-81ac-43f7-9075-51cb26200c4e nodeName:}" failed. No retries permitted until 2025-12-03 06:55:35.275814726 +0000 UTC m=+1472.619398234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts") pod "neutronb468-account-delete-9hl65" (UID: "6f54df74-81ac-43f7-9075-51cb26200c4e") : configmap "openstack-scripts" not found
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.302252 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-q4vmd"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.325305 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-q4vmd"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.325766 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42a7022437eee40dbce5d592f4a83cabd1f8ff8152cabd031da50508af22483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.327126 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42a7022437eee40dbce5d592f4a83cabd1f8ff8152cabd031da50508af22483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.330210 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="42a7022437eee40dbce5d592f4a83cabd1f8ff8152cabd031da50508af22483d" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Dec 03 06:55:34 crc kubenswrapper[4831]: E1203 06:55:34.330259 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="aadb65a0-295d-4fcf-b148-44480346d357" containerName="nova-scheduler-scheduler"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.344374 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi8edc-account-delete-2pvdg"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.347088 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8edc-account-create-update-fnzhp"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.353912 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8edc-account-create-update-fnzhp"]
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.429190 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api" probeResult="failure" output="Get \"https://10.217.0.173:8776/healthcheck\": dial tcp 10.217.0.173:8776: connect: connection refused"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.662576 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/neutron-844cdc6797-kqpvp" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-httpd" probeResult="failure" output="Get \"https://10.217.0.152:9696/\": dial tcp 10.217.0.152:9696: connect: connection refused"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.686531 4831 scope.go:117] "RemoveContainer" containerID="08a79d800c1a289cb8bcbf290c6e99647a9abaf260f75ecabb6f2350b81fe0f4"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.689453 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone7cf1-account-delete-k424c"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.797208 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.876536 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-combined-ca-bundle\") pod \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") "
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.876723 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-certs\") pod \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") "
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.876778 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2qw6\" (UniqueName: \"kubernetes.io/projected/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-api-access-q2qw6\") pod \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") "
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.876834 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-config\") pod \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\" (UID: \"5bf96e96-13ba-44c9-b16e-b1c2acbfc643\") "
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.896648 4831 scope.go:117] "RemoveContainer" containerID="4de12fb31b83a10dc830b04fa625356818e600bca291e7e4c8c179b0fba4550c"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.898001 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-api-access-q2qw6" (OuterVolumeSpecName: "kube-api-access-q2qw6") pod "5bf96e96-13ba-44c9-b16e-b1c2acbfc643" (UID: "5bf96e96-13ba-44c9-b16e-b1c2acbfc643"). InnerVolumeSpecName "kube-api-access-q2qw6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.958913 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-config" (OuterVolumeSpecName: "kube-state-metrics-tls-config") pod "5bf96e96-13ba-44c9-b16e-b1c2acbfc643" (UID: "5bf96e96-13ba-44c9-b16e-b1c2acbfc643"). InnerVolumeSpecName "kube-state-metrics-tls-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.961027 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-certs" (OuterVolumeSpecName: "kube-state-metrics-tls-certs") pod "5bf96e96-13ba-44c9-b16e-b1c2acbfc643" (UID: "5bf96e96-13ba-44c9-b16e-b1c2acbfc643"). InnerVolumeSpecName "kube-state-metrics-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.975038 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican3e51-account-delete-4jnl5" event={"ID":"81084d6a-9987-4466-8f89-455aa3ff2627","Type":"ContainerStarted","Data":"3a6b737f1b9367c00835c9a404c4bc9ba862def65b579c5143283c26dba94964"}
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.986109 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi8edc-account-delete-2pvdg" event={"ID":"56055ee5-407e-4ced-865f-03585e5f7f7b","Type":"ContainerStarted","Data":"a3c72b37845e43bea5a19b3a3ec03f881d38c6c718efa1c8831dee03df6cff88"}
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.990449 4831 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-certs\") on node \"crc\" DevicePath \"\""
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.990473 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2qw6\" (UniqueName: \"kubernetes.io/projected/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-api-access-q2qw6\") on node \"crc\" DevicePath \"\""
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.990482 4831 reconciler_common.go:293] "Volume detached for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-kube-state-metrics-tls-config\") on node \"crc\" DevicePath \"\""
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.992205 4831 generic.go:334] "Generic (PLEG): container finished" podID="e1ffe861-7d12-49e2-9737-fc100833da39" containerID="2a6aed804c2a9f3ed8236fffc4704163803135cc2b4f57e197408d2f4c85bb43" exitCode=0
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.992267 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e1ffe861-7d12-49e2-9737-fc100833da39","Type":"ContainerDied","Data":"2a6aed804c2a9f3ed8236fffc4704163803135cc2b4f57e197408d2f4c85bb43"}
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.994010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell07556-account-delete-rtljc" event={"ID":"2ef057cb-0d02-49d4-a20d-ac9a3f85484e","Type":"ContainerStarted","Data":"d29ae68af819bb49831c646ba7cee33f52be33973d438d7bca8d1ed398a01440"}
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.997025 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-95nwv_5fe1a689-1241-4c11-93ca-875e53319668/ovn-controller/0.log"
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.997086 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-95nwv" event={"ID":"5fe1a689-1241-4c11-93ca-875e53319668","Type":"ContainerDied","Data":"23be6442d413a3e0cc1c8839e5cef9929c9b6459e187771ca9c32fd102e5b8a4"}
Dec 03 06:55:34 crc kubenswrapper[4831]: I1203 06:55:34.997111 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="23be6442d413a3e0cc1c8839e5cef9929c9b6459e187771ca9c32fd102e5b8a4"
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:34.998375 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5bf96e96-13ba-44c9-b16e-b1c2acbfc643" (UID: "5bf96e96-13ba-44c9-b16e-b1c2acbfc643"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.000151 4831 generic.go:334] "Generic (PLEG): container finished" podID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerID="fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d" exitCode=0
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.000224 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerDied","Data":"fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d"}
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.005087 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"5bf96e96-13ba-44c9-b16e-b1c2acbfc643","Type":"ContainerDied","Data":"b7b04ebc30dc59fa6e6dda0dd683307313d31792abba044137eac746a0377a8a"}
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.005155 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0"
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.022231 4831 generic.go:334] "Generic (PLEG): container finished" podID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerID="134ca55163ecb5e5269d7ec7014e4b029385ad75e92d2133a4bef2bd84b55d9f" exitCode=0
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.035938 4831 generic.go:334] "Generic (PLEG): container finished" podID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerID="a5a52868ab8bcbffa60ec710ffeaf3085cae1de015dd39d00cd7003328b22a28" exitCode=0
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.038352 4831 generic.go:334] "Generic (PLEG): container finished" podID="aadb65a0-295d-4fcf-b148-44480346d357" containerID="42a7022437eee40dbce5d592f4a83cabd1f8ff8152cabd031da50508af22483d" exitCode=0
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.045432 4831 generic.go:334] "Generic (PLEG): container finished" podID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerID="74e0644a9792269f43c86550ff06e6fa5782d3aaca7a69696c14e40e26a5beec" exitCode=0
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.052043 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a18ae10-7f43-4072-b01c-1564735985be" containerID="3733faf327347b93ef92bc4d96624b9ad7e5d0b2f31cffa41222a7c371a88c3d" exitCode=0
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.056375 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0310482f-ea00-44c5-8450-6b3b3ebe5e5f" path="/var/lib/kubelet/pods/0310482f-ea00-44c5-8450-6b3b3ebe5e5f/volumes"
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.056851 4831 generic.go:334] "Generic (PLEG): container finished" podID="c60bce87-ea0b-4b3d-8243-93ed40c232ff" containerID="d2b937efb5cb363c5d2b1ffebf2c05783efceefbe6e756b381e8b96438b287fb" exitCode=0
Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.059930 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir"
podUID="072f167c-3d27-49f2-aae9-84ddb84e6a8e" path="/var/lib/kubelet/pods/072f167c-3d27-49f2-aae9-84ddb84e6a8e/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.060827 4831 generic.go:334] "Generic (PLEG): container finished" podID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerID="0fb9b727228c96fa187f5a0e156c0ff4cac686daaafe9b93470fdfcc2071ea6f" exitCode=0 Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.062046 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29e57717-9dd6-441b-b3bc-563ce1951f14" path="/var/lib/kubelet/pods/29e57717-9dd6-441b-b3bc-563ce1951f14/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.079566 4831 generic.go:334] "Generic (PLEG): container finished" podID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerID="9c37d8393b980bdf5ae1d421b8b8297aec83f883d91765e900851761b2758842" exitCode=0 Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.082867 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3614a116-7996-4106-a222-e2542d9dd89d" path="/var/lib/kubelet/pods/3614a116-7996-4106-a222-e2542d9dd89d/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.083737 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a2f4d8e-71bd-4abc-b0db-165cb43fea80" path="/var/lib/kubelet/pods/4a2f4d8e-71bd-4abc-b0db-165cb43fea80/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.084365 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59ff8d77-29ce-4d59-919c-4280b2489608" path="/var/lib/kubelet/pods/59ff8d77-29ce-4d59-919c-4280b2489608/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.085593 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="66411938-f6a0-4c6a-a478-2b9a451a2275" path="/var/lib/kubelet/pods/66411938-f6a0-4c6a-a478-2b9a451a2275/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.086491 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="67dbccb6-784e-4fba-8b01-227a5b3d1a3e" path="/var/lib/kubelet/pods/67dbccb6-784e-4fba-8b01-227a5b3d1a3e/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.087525 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="78440d07-b290-4710-a9b0-ce6c0b5efa0c" path="/var/lib/kubelet/pods/78440d07-b290-4710-a9b0-ce6c0b5efa0c/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.088207 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12b0f68-97d1-4253-b83b-692dbb30d970" path="/var/lib/kubelet/pods/b12b0f68-97d1-4253-b83b-692dbb30d970/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.088797 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c30990-3c55-4441-8325-1132b7411671" path="/var/lib/kubelet/pods/b2c30990-3c55-4441-8325-1132b7411671/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.089338 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b55e481e-04a7-4cc6-bddd-7f8a87ca6b77" path="/var/lib/kubelet/pods/b55e481e-04a7-4cc6-bddd-7f8a87ca6b77/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.090849 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba9ef9a7-5412-4f03-8656-86f20966986f" path="/var/lib/kubelet/pods/ba9ef9a7-5412-4f03-8656-86f20966986f/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.092063 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5bf96e96-13ba-44c9-b16e-b1c2acbfc643-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.095252 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bce2f30f-5796-4bcd-8c15-f9c71969ebb1" path="/var/lib/kubelet/pods/bce2f30f-5796-4bcd-8c15-f9c71969ebb1/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.096699 4831 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="e0778b59-e2e9-4afe-ab6d-3d85eebba895" path="/var/lib/kubelet/pods/e0778b59-e2e9-4afe-ab6d-3d85eebba895/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.097223 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4cd3e86-d46d-47f3-8477-4dfff4134915" path="/var/lib/kubelet/pods/e4cd3e86-d46d-47f3-8477-4dfff4134915/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.097747 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa364658-00c2-41ba-bb0d-eaae5161de19" path="/var/lib/kubelet/pods/fa364658-00c2-41ba-bb0d-eaae5161de19/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.098824 4831 generic.go:334] "Generic (PLEG): container finished" podID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerID="f6d28e8b3135eebdfd374f9c303c4d8edb75a0da36bd550a21205abedf1942c2" exitCode=0 Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.098946 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone7cf1-account-delete-k424c" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.100221 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff" path="/var/lib/kubelet/pods/fcbcaae5-50c9-4679-9af5-d9cf5ba7a6ff/volumes" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.100791 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"267687cf-58da-42e0-852e-c8c87f2ea42a","Type":"ContainerDied","Data":"134ca55163ecb5e5269d7ec7014e4b029385ad75e92d2133a4bef2bd84b55d9f"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.100814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"267687cf-58da-42e0-852e-c8c87f2ea42a","Type":"ContainerDied","Data":"8a05a40ba3b50363e84091dad83d45d1e43ef23cccb85cb2cf5a9c3d010401e0"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110597 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a05a40ba3b50363e84091dad83d45d1e43ef23cccb85cb2cf5a9c3d010401e0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110625 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement2619-account-delete-d6v4b" event={"ID":"20722e3f-f810-4ac7-80d7-09cae400150a","Type":"ContainerStarted","Data":"dd287a893c75f576c5531a39dd2895a9a3b0aeab155d78a6353480d4431e316c"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110646 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50","Type":"ContainerDied","Data":"a5a52868ab8bcbffa60ec710ffeaf3085cae1de015dd39d00cd7003328b22a28"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110661 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50","Type":"ContainerDied","Data":"7c9fd6d53d6b13b72e1b080be38d06ab0ab8c4de2d04ebb61ebaf54998d99e46"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110669 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c9fd6d53d6b13b72e1b080be38d06ab0ab8c4de2d04ebb61ebaf54998d99e46" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110677 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aadb65a0-295d-4fcf-b148-44480346d357","Type":"ContainerDied","Data":"42a7022437eee40dbce5d592f4a83cabd1f8ff8152cabd031da50508af22483d"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110688 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a5d88e3-73a3-4f3d-af31-af675ab452bd","Type":"ContainerDied","Data":"74e0644a9792269f43c86550ff06e6fa5782d3aaca7a69696c14e40e26a5beec"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110700 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"7a5d88e3-73a3-4f3d-af31-af675ab452bd","Type":"ContainerDied","Data":"2dcbb7352b7ae850b9362974b1e839ce31bdf2ecd80c09b95e3c0056154962ad"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110708 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dcbb7352b7ae850b9362974b1e839ce31bdf2ecd80c09b95e3c0056154962ad" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110715 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a18ae10-7f43-4072-b01c-1564735985be","Type":"ContainerDied","Data":"3733faf327347b93ef92bc4d96624b9ad7e5d0b2f31cffa41222a7c371a88c3d"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110729 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5d9bffbcdd-ztjkw" 
event={"ID":"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69","Type":"ContainerDied","Data":"48ad030d7e7240a5a4a994cc89a03c1b56483b26d918f8e50821b561778b7b57"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110739 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48ad030d7e7240a5a4a994cc89a03c1b56483b26d918f8e50821b561778b7b57" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110747 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c60bce87-ea0b-4b3d-8243-93ed40c232ff","Type":"ContainerDied","Data":"d2b937efb5cb363c5d2b1ffebf2c05783efceefbe6e756b381e8b96438b287fb"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110759 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder3a6d-account-delete-b6ptg" event={"ID":"e69614e3-a574-42e1-adc2-09861e9974e5","Type":"ContainerStarted","Data":"1c0a461627ebef432fa1b5bcbe2e6bde6b1afb13041733bfe0e1de0dc065b1f9"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110773 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1","Type":"ContainerDied","Data":"0fb9b727228c96fa187f5a0e156c0ff4cac686daaafe9b93470fdfcc2071ea6f"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110784 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574cdc6988-72ggg" event={"ID":"770b98aa-f177-4c7b-b37e-1664c039f47d","Type":"ContainerDied","Data":"9c37d8393b980bdf5ae1d421b8b8297aec83f883d91765e900851761b2758842"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110795 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-574cdc6988-72ggg" event={"ID":"770b98aa-f177-4c7b-b37e-1664c039f47d","Type":"ContainerDied","Data":"473b00dcb7558e67e9aa08359e3ab4519754512b16d181ab50164a20e690d308"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110802 4831 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="473b00dcb7558e67e9aa08359e3ab4519754512b16d181ab50164a20e690d308" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110810 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586","Type":"ContainerDied","Data":"f6d28e8b3135eebdfd374f9c303c4d8edb75a0da36bd550a21205abedf1942c2"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110820 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586","Type":"ContainerDied","Data":"3adbb64821e5a6d0a9e9d25bb799ddb3dcbe390f239f4223862330f591416d59"} Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.110827 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3adbb64821e5a6d0a9e9d25bb799ddb3dcbe390f239f4223862330f591416d59" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.203596 4831 scope.go:117] "RemoveContainer" containerID="aca1e2538c4e0fa0e9737c5d8550f625f94f8366334b8f2873812e5895924cfc" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.217360 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc"] Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.230733 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-6bc44fb5cd-xtqcc"] Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.237792 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-proxy-6d4758c757-wtw5x"] Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.243867 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-proxy-6d4758c757-wtw5x"] Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.298863 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-95nwv_5fe1a689-1241-4c11-93ca-875e53319668/ovn-controller/0.log" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.298958 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv" Dec 03 06:55:35 crc kubenswrapper[4831]: E1203 06:55:35.304373 4831 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 06:55:35 crc kubenswrapper[4831]: E1203 06:55:35.304456 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts podName:6f54df74-81ac-43f7-9075-51cb26200c4e nodeName:}" failed. No retries permitted until 2025-12-03 06:55:37.304434133 +0000 UTC m=+1474.648017641 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts") pod "neutronb468-account-delete-9hl65" (UID: "6f54df74-81ac-43f7-9075-51cb26200c4e") : configmap "openstack-scripts" not found Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.364635 4831 scope.go:117] "RemoveContainer" containerID="499ef1cada2eb4986f003c7c29c7c54c82c99a884b5bcf1f1b4f4fbda65a3300" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.394885 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.404923 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-internal-tls-certs\") pod \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405002 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb6d7\" (UniqueName: \"kubernetes.io/projected/5fe1a689-1241-4c11-93ca-875e53319668-kube-api-access-kb6d7\") pod \"5fe1a689-1241-4c11-93ca-875e53319668\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405043 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-public-tls-certs\") pod \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405143 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-ovn-controller-tls-certs\") pod \"5fe1a689-1241-4c11-93ca-875e53319668\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405266 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-logs\") pod \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405298 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"kube-api-access-tgdxp\" (UniqueName: \"kubernetes.io/projected/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-kube-api-access-tgdxp\") pod \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405330 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run-ovn\") pod \"5fe1a689-1241-4c11-93ca-875e53319668\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405346 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-combined-ca-bundle\") pod \"5fe1a689-1241-4c11-93ca-875e53319668\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405367 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run\") pod \"5fe1a689-1241-4c11-93ca-875e53319668\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405550 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-config-data\") pod \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405568 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe1a689-1241-4c11-93ca-875e53319668-scripts\") pod \"5fe1a689-1241-4c11-93ca-875e53319668\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " Dec 03 06:55:35 crc 
kubenswrapper[4831]: I1203 06:55:35.405620 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-scripts\") pod \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405661 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-combined-ca-bundle\") pod \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\" (UID: \"cccc3a0b-98c7-4930-a7b5-3c1320a5ee69\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.405695 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-log-ovn\") pod \"5fe1a689-1241-4c11-93ca-875e53319668\" (UID: \"5fe1a689-1241-4c11-93ca-875e53319668\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.406107 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5fe1a689-1241-4c11-93ca-875e53319668" (UID: "5fe1a689-1241-4c11-93ca-875e53319668"). InnerVolumeSpecName "var-log-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.406944 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.406979 4831 scope.go:117] "RemoveContainer" containerID="9e502551a8c629987632ae4db8d5bb2dcc87f5d6796281aca903d0518128dc14" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.408033 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-logs" (OuterVolumeSpecName: "logs") pod "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" (UID: "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.408505 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5fe1a689-1241-4c11-93ca-875e53319668-scripts" (OuterVolumeSpecName: "scripts") pod "5fe1a689-1241-4c11-93ca-875e53319668" (UID: "5fe1a689-1241-4c11-93ca-875e53319668"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.408682 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run" (OuterVolumeSpecName: "var-run") pod "5fe1a689-1241-4c11-93ca-875e53319668" (UID: "5fe1a689-1241-4c11-93ca-875e53319668"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.410707 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.415386 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5fe1a689-1241-4c11-93ca-875e53319668" (UID: "5fe1a689-1241-4c11-93ca-875e53319668"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.420187 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe1a689-1241-4c11-93ca-875e53319668-kube-api-access-kb6d7" (OuterVolumeSpecName: "kube-api-access-kb6d7") pod "5fe1a689-1241-4c11-93ca-875e53319668" (UID: "5fe1a689-1241-4c11-93ca-875e53319668"). InnerVolumeSpecName "kube-api-access-kb6d7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.435428 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.437121 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.437702 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.457406 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone7cf1-account-delete-k424c"] Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.461814 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone7cf1-account-delete-k424c"] Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.464346 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-kube-api-access-tgdxp" (OuterVolumeSpecName: "kube-api-access-tgdxp") pod "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" (UID: "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69"). InnerVolumeSpecName "kube-api-access-tgdxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.469619 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.482505 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.484766 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-scripts" (OuterVolumeSpecName: "scripts") pod "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" (UID: "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.488187 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.491256 4831 scope.go:117] "RemoveContainer" containerID="bbb0b5757917dd69f07b05eab8c7233245af6d5b34e3733e43bfb640fb96d6e3" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.495160 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.504305 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.506922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-scripts\") pod \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.506974 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"267687cf-58da-42e0-852e-c8c87f2ea42a\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507015 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h5dxt\" (UniqueName: \"kubernetes.io/projected/aadb65a0-295d-4fcf-b148-44480346d357-kube-api-access-h5dxt\") pod \"aadb65a0-295d-4fcf-b148-44480346d357\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507040 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-public-tls-certs\") pod \"267687cf-58da-42e0-852e-c8c87f2ea42a\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " Dec 03 06:55:35 crc 
kubenswrapper[4831]: I1203 06:55:35.507064 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-internal-tls-certs\") pod \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507132 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-logs\") pod \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507169 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data\") pod \"770b98aa-f177-4c7b-b37e-1664c039f47d\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507193 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-scripts\") pod \"267687cf-58da-42e0-852e-c8c87f2ea42a\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507231 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-logs\") pod \"267687cf-58da-42e0-852e-c8c87f2ea42a\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-internal-tls-certs\") pod \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\" 
(UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507369 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-combined-ca-bundle\") pod \"aadb65a0-295d-4fcf-b148-44480346d357\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507400 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-config-data\") pod \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507426 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-nova-metadata-tls-certs\") pod \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507479 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-logs\") pod \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507515 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-combined-ca-bundle\") pod \"267687cf-58da-42e0-852e-c8c87f2ea42a\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507536 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-config-data\") pod \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507571 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czslk\" (UniqueName: \"kubernetes.io/projected/770b98aa-f177-4c7b-b37e-1664c039f47d-kube-api-access-czslk\") pod \"770b98aa-f177-4c7b-b37e-1664c039f47d\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507599 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-combined-ca-bundle\") pod \"770b98aa-f177-4c7b-b37e-1664c039f47d\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507629 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4x2j\" (UniqueName: \"kubernetes.io/projected/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-kube-api-access-j4x2j\") pod \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507656 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-public-tls-certs\") pod \"770b98aa-f177-4c7b-b37e-1664c039f47d\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507680 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-combined-ca-bundle\") pod \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " Dec 03 06:55:35 crc 
kubenswrapper[4831]: I1203 06:55:35.507704 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-httpd-run\") pod \"267687cf-58da-42e0-852e-c8c87f2ea42a\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507730 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-config-data\") pod \"aadb65a0-295d-4fcf-b148-44480346d357\" (UID: \"aadb65a0-295d-4fcf-b148-44480346d357\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507778 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-config-data\") pod \"1a18ae10-7f43-4072-b01c-1564735985be\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507809 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-combined-ca-bundle\") pod \"1a18ae10-7f43-4072-b01c-1564735985be\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507840 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2rj4\" (UniqueName: \"kubernetes.io/projected/1a18ae10-7f43-4072-b01c-1564735985be-kube-api-access-d2rj4\") pod \"1a18ae10-7f43-4072-b01c-1564735985be\" (UID: \"1a18ae10-7f43-4072-b01c-1564735985be\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507867 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcj8g\" (UniqueName: 
\"kubernetes.io/projected/267687cf-58da-42e0-852e-c8c87f2ea42a-kube-api-access-fcj8g\") pod \"267687cf-58da-42e0-852e-c8c87f2ea42a\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507923 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507957 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-config-data\") pod \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.507990 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-public-tls-certs\") pod \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\" (UID: \"9a438bff-fbe4-4ae4-8d0f-3eecc1819f50\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508013 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-internal-tls-certs\") pod \"770b98aa-f177-4c7b-b37e-1664c039f47d\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508034 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-config-data\") pod \"267687cf-58da-42e0-852e-c8c87f2ea42a\" (UID: \"267687cf-58da-42e0-852e-c8c87f2ea42a\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508058 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-combined-ca-bundle\") pod \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508112 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x45tm\" (UniqueName: \"kubernetes.io/projected/7a5d88e3-73a3-4f3d-af31-af675ab452bd-kube-api-access-x45tm\") pod \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508137 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-combined-ca-bundle\") pod \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508160 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-logs\") pod \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508187 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnzss\" (UniqueName: \"kubernetes.io/projected/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-kube-api-access-gnzss\") pod \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\" (UID: \"a5bbc77d-ca7b-48ab-b8bc-b304a12bb586\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508218 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data-custom\") pod 
\"770b98aa-f177-4c7b-b37e-1664c039f47d\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508242 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-httpd-run\") pod \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\" (UID: \"7a5d88e3-73a3-4f3d-af31-af675ab452bd\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508263 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770b98aa-f177-4c7b-b37e-1664c039f47d-logs\") pod \"770b98aa-f177-4c7b-b37e-1664c039f47d\" (UID: \"770b98aa-f177-4c7b-b37e-1664c039f47d\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508793 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5fe1a689-1241-4c11-93ca-875e53319668-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508818 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508831 4831 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-log-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508844 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kb6d7\" (UniqueName: \"kubernetes.io/projected/5fe1a689-1241-4c11-93ca-875e53319668-kube-api-access-kb6d7\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508858 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508888 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgdxp\" (UniqueName: \"kubernetes.io/projected/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-kube-api-access-tgdxp\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508901 4831 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.508915 4831 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5fe1a689-1241-4c11-93ca-875e53319668-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.513498 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "267687cf-58da-42e0-852e-c8c87f2ea42a" (UID: "267687cf-58da-42e0-852e-c8c87f2ea42a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.519872 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "7a5d88e3-73a3-4f3d-af31-af675ab452bd" (UID: "7a5d88e3-73a3-4f3d-af31-af675ab452bd"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.520505 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-logs" (OuterVolumeSpecName: "logs") pod "7a5d88e3-73a3-4f3d-af31-af675ab452bd" (UID: "7a5d88e3-73a3-4f3d-af31-af675ab452bd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.524875 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-logs" (OuterVolumeSpecName: "logs") pod "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" (UID: "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.529870 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-logs" (OuterVolumeSpecName: "logs") pod "267687cf-58da-42e0-852e-c8c87f2ea42a" (UID: "267687cf-58da-42e0-852e-c8c87f2ea42a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.531751 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.533364 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.539462 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-logs" (OuterVolumeSpecName: "logs") pod "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" (UID: "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586"). 
InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.571707 4831 scope.go:117] "RemoveContainer" containerID="8b47e6d72bbe33bdcb24b9fec31efaddaf7bd0c2d98f21c24f88fbde38ed48a1" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.587064 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/770b98aa-f177-4c7b-b37e-1664c039f47d-logs" (OuterVolumeSpecName: "logs") pod "770b98aa-f177-4c7b-b37e-1664c039f47d" (UID: "770b98aa-f177-4c7b-b37e-1664c039f47d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.609878 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-internal-tls-certs\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.609966 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-config-data\") pod \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610015 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-logs\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zz2mw\" (UniqueName: \"kubernetes.io/projected/c60bce87-ea0b-4b3d-8243-93ed40c232ff-kube-api-access-zz2mw\") pod 
\"c60bce87-ea0b-4b3d-8243-93ed40c232ff\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610156 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-combined-ca-bundle\") pod \"e1ffe861-7d12-49e2-9737-fc100833da39\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610199 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwcmz\" (UniqueName: \"kubernetes.io/projected/e1ffe861-7d12-49e2-9737-fc100833da39-kube-api-access-rwcmz\") pod \"e1ffe861-7d12-49e2-9737-fc100833da39\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610285 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-public-tls-certs\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610375 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-config-data\") pod \"e1ffe861-7d12-49e2-9737-fc100833da39\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610406 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610461 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-memcached-tls-certs\") pod \"e1ffe861-7d12-49e2-9737-fc100833da39\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610476 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-combined-ca-bundle\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610496 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-kolla-config\") pod \"e1ffe861-7d12-49e2-9737-fc100833da39\" (UID: \"e1ffe861-7d12-49e2-9737-fc100833da39\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610523 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610554 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlhps\" (UniqueName: \"kubernetes.io/projected/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-kube-api-access-mlhps\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610591 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: 
\"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610605 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-etc-machine-id\") pod \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\" (UID: \"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610632 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-combined-ca-bundle\") pod \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\" (UID: \"c60bce87-ea0b-4b3d-8243-93ed40c232ff\") " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610988 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.611002 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.611011 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/7a5d88e3-73a3-4f3d-af31-af675ab452bd-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.611019 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/770b98aa-f177-4c7b-b37e-1664c039f47d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.611027 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-logs\") on node \"crc\" 
DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.611035 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/267687cf-58da-42e0-852e-c8c87f2ea42a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.611044 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.610367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "7a5d88e3-73a3-4f3d-af31-af675ab452bd" (UID: "7a5d88e3-73a3-4f3d-af31-af675ab452bd"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: E1203 06:55:35.611096 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-config-data: configmap "rabbitmq-config-data" not found Dec 03 06:55:35 crc kubenswrapper[4831]: E1203 06:55:35.613219 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data podName:dc0cbb94-92ec-4369-b609-f3186f302c66 nodeName:}" failed. No retries permitted until 2025-12-03 06:55:43.613179776 +0000 UTC m=+1480.956763274 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data") pod "rabbitmq-server-0" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66") : configmap "rabbitmq-config-data" not found Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.613020 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.613025 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-scripts" (OuterVolumeSpecName: "scripts") pod "7a5d88e3-73a3-4f3d-af31-af675ab452bd" (UID: "7a5d88e3-73a3-4f3d-af31-af675ab452bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.613074 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-logs" (OuterVolumeSpecName: "logs") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.614337 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "e1ffe861-7d12-49e2-9737-fc100833da39" (UID: "e1ffe861-7d12-49e2-9737-fc100833da39"). InnerVolumeSpecName "kolla-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.615541 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-config-data" (OuterVolumeSpecName: "config-data") pod "e1ffe861-7d12-49e2-9737-fc100833da39" (UID: "e1ffe861-7d12-49e2-9737-fc100833da39"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.617251 4831 scope.go:117] "RemoveContainer" containerID="a1a3fe54cd3d30665e97d6fbc19db3951584ee5c9ac2f3ff590c8299afe5e9f9" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.625490 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aadb65a0-295d-4fcf-b148-44480346d357-kube-api-access-h5dxt" (OuterVolumeSpecName: "kube-api-access-h5dxt") pod "aadb65a0-295d-4fcf-b148-44480346d357" (UID: "aadb65a0-295d-4fcf-b148-44480346d357"). InnerVolumeSpecName "kube-api-access-h5dxt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.625577 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "267687cf-58da-42e0-852e-c8c87f2ea42a" (UID: "267687cf-58da-42e0-852e-c8c87f2ea42a"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.655980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a5d88e3-73a3-4f3d-af31-af675ab452bd-kube-api-access-x45tm" (OuterVolumeSpecName: "kube-api-access-x45tm") pod "7a5d88e3-73a3-4f3d-af31-af675ab452bd" (UID: "7a5d88e3-73a3-4f3d-af31-af675ab452bd"). InnerVolumeSpecName "kube-api-access-x45tm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.704708 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c60bce87-ea0b-4b3d-8243-93ed40c232ff-kube-api-access-zz2mw" (OuterVolumeSpecName: "kube-api-access-zz2mw") pod "c60bce87-ea0b-4b3d-8243-93ed40c232ff" (UID: "c60bce87-ea0b-4b3d-8243-93ed40c232ff"). InnerVolumeSpecName "kube-api-access-zz2mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.708725 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-kube-api-access-j4x2j" (OuterVolumeSpecName: "kube-api-access-j4x2j") pod "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" (UID: "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50"). InnerVolumeSpecName "kube-api-access-j4x2j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.709536 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-scripts" (OuterVolumeSpecName: "scripts") pod "267687cf-58da-42e0-852e-c8c87f2ea42a" (UID: "267687cf-58da-42e0-852e-c8c87f2ea42a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.710606 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-kube-api-access-gnzss" (OuterVolumeSpecName: "kube-api-access-gnzss") pod "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" (UID: "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586"). InnerVolumeSpecName "kube-api-access-gnzss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.710612 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "770b98aa-f177-4c7b-b37e-1664c039f47d" (UID: "770b98aa-f177-4c7b-b37e-1664c039f47d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.710655 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a18ae10-7f43-4072-b01c-1564735985be-kube-api-access-d2rj4" (OuterVolumeSpecName: "kube-api-access-d2rj4") pod "1a18ae10-7f43-4072-b01c-1564735985be" (UID: "1a18ae10-7f43-4072-b01c-1564735985be"). InnerVolumeSpecName "kube-api-access-d2rj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.710929 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/267687cf-58da-42e0-852e-c8c87f2ea42a-kube-api-access-fcj8g" (OuterVolumeSpecName: "kube-api-access-fcj8g") pod "267687cf-58da-42e0-852e-c8c87f2ea42a" (UID: "267687cf-58da-42e0-852e-c8c87f2ea42a"). InnerVolumeSpecName "kube-api-access-fcj8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717740 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717764 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j4x2j\" (UniqueName: \"kubernetes.io/projected/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-kube-api-access-j4x2j\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717773 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717783 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2rj4\" (UniqueName: \"kubernetes.io/projected/1a18ae10-7f43-4072-b01c-1564735985be-kube-api-access-d2rj4\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717811 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zz2mw\" (UniqueName: \"kubernetes.io/projected/c60bce87-ea0b-4b3d-8243-93ed40c232ff-kube-api-access-zz2mw\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717821 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcj8g\" (UniqueName: \"kubernetes.io/projected/267687cf-58da-42e0-852e-c8c87f2ea42a-kube-api-access-fcj8g\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717843 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 
06:55:35.717852 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x45tm\" (UniqueName: \"kubernetes.io/projected/7a5d88e3-73a3-4f3d-af31-af675ab452bd-kube-api-access-x45tm\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717862 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnzss\" (UniqueName: \"kubernetes.io/projected/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-kube-api-access-gnzss\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717890 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717900 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717915 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717923 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h5dxt\" (UniqueName: \"kubernetes.io/projected/aadb65a0-295d-4fcf-b148-44480346d357-kube-api-access-h5dxt\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717932 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717943 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.717970 4831 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/e1ffe861-7d12-49e2-9737-fc100833da39-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.719219 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770b98aa-f177-4c7b-b37e-1664c039f47d-kube-api-access-czslk" (OuterVolumeSpecName: "kube-api-access-czslk") pod "770b98aa-f177-4c7b-b37e-1664c039f47d" (UID: "770b98aa-f177-4c7b-b37e-1664c039f47d"). InnerVolumeSpecName "kube-api-access-czslk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.725617 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-kube-api-access-mlhps" (OuterVolumeSpecName: "kube-api-access-mlhps") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "kube-api-access-mlhps". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.726503 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.726672 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ffe861-7d12-49e2-9737-fc100833da39-kube-api-access-rwcmz" (OuterVolumeSpecName: "kube-api-access-rwcmz") pod "e1ffe861-7d12-49e2-9737-fc100833da39" (UID: "e1ffe861-7d12-49e2-9737-fc100833da39"). InnerVolumeSpecName "kube-api-access-rwcmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.742028 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts" (OuterVolumeSpecName: "scripts") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.819897 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlhps\" (UniqueName: \"kubernetes.io/projected/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-kube-api-access-mlhps\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.819941 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.819959 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czslk\" (UniqueName: \"kubernetes.io/projected/770b98aa-f177-4c7b-b37e-1664c039f47d-kube-api-access-czslk\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.819976 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwcmz\" (UniqueName: \"kubernetes.io/projected/e1ffe861-7d12-49e2-9737-fc100833da39-kube-api-access-rwcmz\") on 
node \"crc\" DevicePath \"\"" Dec 03 06:55:35 crc kubenswrapper[4831]: I1203 06:55:35.819991 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.088486 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.131689 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.135492 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" event={"ID":"4b96e376-8104-4a92-b1ec-6078943d0b50","Type":"ContainerStarted","Data":"09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.136017 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerName="barbican-keystone-listener-log" containerID="cri-o://0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca" gracePeriod=30 Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.136427 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerName="barbican-keystone-listener" containerID="cri-o://09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee" gracePeriod=30 Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.150412 4831 operation_generator.go:917] UnmountDevice 
succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.153129 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1a18ae10-7f43-4072-b01c-1564735985be","Type":"ContainerDied","Data":"a6116f2662c7629f66fb5e0289f22eab236dae54ed70aba728091ccce2d2fbd4"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.153177 4831 scope.go:117] "RemoveContainer" containerID="3733faf327347b93ef92bc4d96624b9ad7e5d0b2f31cffa41222a7c371a88c3d" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.153284 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.177519 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" podStartSLOduration=9.177499065 podStartE2EDuration="9.177499065s" podCreationTimestamp="2025-12-03 06:55:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:36.175328928 +0000 UTC m=+1473.518912436" watchObservedRunningTime="2025-12-03 06:55:36.177499065 +0000 UTC m=+1473.521082573" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.184165 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell07556-account-delete-rtljc" event={"ID":"2ef057cb-0d02-49d4-a20d-ac9a3f85484e","Type":"ContainerStarted","Data":"1ce4d862c594baa625d818638ef9af5cb1a640bf2485dc86c14507efd56534c6"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.184359 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novacell07556-account-delete-rtljc" podUID="2ef057cb-0d02-49d4-a20d-ac9a3f85484e" containerName="mariadb-account-delete" 
containerID="cri-o://1ce4d862c594baa625d818638ef9af5cb1a640bf2485dc86c14507efd56534c6" gracePeriod=30 Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.193016 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance3e24-account-delete-nkv48" event={"ID":"54eb5ffe-7e2f-4a33-9689-0470affe10e0","Type":"ContainerStarted","Data":"15b7bbc85d81bb5e07862ca4b4de5ff3923ce96b0caa2a68467827bd0e8e0b7f"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.193842 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance3e24-account-delete-nkv48" podUID="54eb5ffe-7e2f-4a33-9689-0470affe10e0" containerName="mariadb-account-delete" containerID="cri-o://15b7bbc85d81bb5e07862ca4b4de5ff3923ce96b0caa2a68467827bd0e8e0b7f" gracePeriod=30 Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.213532 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"e1ffe861-7d12-49e2-9737-fc100833da39","Type":"ContainerDied","Data":"6fc2106d8d8e2ae434b1ee1fc9a96a4258b13ed90efd411bf72d739990bd9e4b"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.213612 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.235620 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.235749 4831 configmap.go:193] Couldn't get configMap openstack/rabbitmq-cell1-config-data: configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.235804 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data podName:8d6ac806-4ac5-4de4-b6a0-b265032150f4 nodeName:}" failed. 
No retries permitted until 2025-12-03 06:55:44.235787413 +0000 UTC m=+1481.579370921 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "config-data" (UniqueName: "kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data") pod "rabbitmq-cell1-server-0" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4") : configmap "rabbitmq-cell1-config-data" not found Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.236438 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novacell07556-account-delete-rtljc" podStartSLOduration=7.236416032 podStartE2EDuration="7.236416032s" podCreationTimestamp="2025-12-03 06:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:36.206555526 +0000 UTC m=+1473.550139034" watchObservedRunningTime="2025-12-03 06:55:36.236416032 +0000 UTC m=+1473.579999540" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.265253 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement2619-account-delete-d6v4b" event={"ID":"20722e3f-f810-4ac7-80d7-09cae400150a","Type":"ContainerStarted","Data":"557c4ed3ac9d594aa36c1567b066b8623beb6bdaab234448402e61224dd28e60"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.265476 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement2619-account-delete-d6v4b" podUID="20722e3f-f810-4ac7-80d7-09cae400150a" containerName="mariadb-account-delete" containerID="cri-o://557c4ed3ac9d594aa36c1567b066b8623beb6bdaab234448402e61224dd28e60" gracePeriod=30 Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.270832 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance3e24-account-delete-nkv48" podStartSLOduration=8.270807229 podStartE2EDuration="8.270807229s" podCreationTimestamp="2025-12-03 06:55:28 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:36.22762484 +0000 UTC m=+1473.571208348" watchObservedRunningTime="2025-12-03 06:55:36.270807229 +0000 UTC m=+1473.614390737" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.304233 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"c60bce87-ea0b-4b3d-8243-93ed40c232ff","Type":"ContainerDied","Data":"7ed4005aa07f9a098997fab85847f9392c3a28c78abe03b54b5d13fcf163d7d7"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.304410 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.323961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"aadb65a0-295d-4fcf-b148-44480346d357","Type":"ContainerDied","Data":"cd394ff880e63d3ae898a844be66ad605e336a324338fbbcd3be61ddca712c73"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.324131 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.330649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"1eab7a3f-11f0-4d00-b436-93cc30c2e8e1","Type":"ContainerDied","Data":"7d7112e1bf2494fbb6f69eb22091f97a0a8fcd4e8434fc22e1091c126c54bd70"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.331112 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.342195 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.342271 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.342399 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.342035 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi8edc-account-delete-2pvdg" event={"ID":"56055ee5-407e-4ced-865f-03585e5f7f7b","Type":"ContainerStarted","Data":"a69329035f458ff6fcb46d258f921e780538a61f81fd40d9c93eb03532d06f8d"} Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.342516 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5d9bffbcdd-ztjkw" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.342542 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-574cdc6988-72ggg" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.342956 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-95nwv" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.345032 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.347328 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/novaapi8edc-account-delete-2pvdg" podUID="56055ee5-407e-4ced-865f-03585e5f7f7b" containerName="mariadb-account-delete" containerID="cri-o://a69329035f458ff6fcb46d258f921e780538a61f81fd40d9c93eb03532d06f8d" gracePeriod=30 Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.379708 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement2619-account-delete-d6v4b" podStartSLOduration=8.379691375 podStartE2EDuration="8.379691375s" podCreationTimestamp="2025-12-03 06:55:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:36.286364161 +0000 UTC m=+1473.629947669" watchObservedRunningTime="2025-12-03 06:55:36.379691375 +0000 UTC m=+1473.723274883" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.435165 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/novaapi8edc-account-delete-2pvdg" podStartSLOduration=7.435143425 podStartE2EDuration="7.435143425s" podCreationTimestamp="2025-12-03 06:55:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 06:55:36.427101146 +0000 UTC m=+1473.770684654" watchObservedRunningTime="2025-12-03 06:55:36.435143425 +0000 UTC m=+1473.778726953" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.634338 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-config-data" (OuterVolumeSpecName: "config-data") pod "aadb65a0-295d-4fcf-b148-44480346d357" (UID: "aadb65a0-295d-4fcf-b148-44480346d357"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.650185 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.654142 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5fe1a689-1241-4c11-93ca-875e53319668" (UID: "5fe1a689-1241-4c11-93ca-875e53319668"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.693181 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.693660 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.695848 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is 
running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.704347 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-config-data" (OuterVolumeSpecName: "config-data") pod "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" (UID: "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.704575 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.704617 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.711469 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.729727 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc 
error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:36 crc kubenswrapper[4831]: E1203 06:55:36.729806 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.743195 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-config-data" (OuterVolumeSpecName: "config-data") pod "c60bce87-ea0b-4b3d-8243-93ed40c232ff" (UID: "c60bce87-ea0b-4b3d-8243-93ed40c232ff"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.753327 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.753401 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.753428 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.810507 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" (UID: "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.855761 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.890152 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-config-data" (OuterVolumeSpecName: "config-data") pod "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" (UID: "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.890326 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" (UID: "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.896615 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" (UID: "a5bbc77d-ca7b-48ab-b8bc-b304a12bb586"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.956970 4831 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.957002 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.957016 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.962777 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-combined-ca-bundle" 
(OuterVolumeSpecName: "combined-ca-bundle") pod "c60bce87-ea0b-4b3d-8243-93ed40c232ff" (UID: "c60bce87-ea0b-4b3d-8243-93ed40c232ff"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.964478 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.972234 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a18ae10-7f43-4072-b01c-1564735985be" (UID: "1a18ae10-7f43-4072-b01c-1564735985be"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:36 crc kubenswrapper[4831]: I1203 06:55:36.984030 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-memcached-tls-certs" (OuterVolumeSpecName: "memcached-tls-certs") pod "e1ffe861-7d12-49e2-9737-fc100833da39" (UID: "e1ffe861-7d12-49e2-9737-fc100833da39"). InnerVolumeSpecName "memcached-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.022306 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "770b98aa-f177-4c7b-b37e-1664c039f47d" (UID: "770b98aa-f177-4c7b-b37e-1664c039f47d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.026275 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d44a6a4-631f-4ccd-ac61-781623a04c11" path="/var/lib/kubelet/pods/5d44a6a4-631f-4ccd-ac61-781623a04c11/volumes" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.027171 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e358bf07-df54-4268-9421-f31c57f5594c" path="/var/lib/kubelet/pods/e358bf07-df54-4268-9421-f31c57f5594c/volumes" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.027999 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd770919-d80e-4947-b52d-673beb117374" path="/var/lib/kubelet/pods/fd770919-d80e-4947-b52d-673beb117374/volumes" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.068087 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.068131 4831 reconciler_common.go:293] "Volume detached for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-memcached-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.068143 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.068157 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c60bce87-ea0b-4b3d-8243-93ed40c232ff-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.068166 4831 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.085411 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7a5d88e3-73a3-4f3d-af31-af675ab452bd" (UID: "7a5d88e3-73a3-4f3d-af31-af675ab452bd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.097463 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-config-data" (OuterVolumeSpecName: "config-data") pod "1a18ae10-7f43-4072-b01c-1564735985be" (UID: "1a18ae10-7f43-4072-b01c-1564735985be"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.114682 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "7a5d88e3-73a3-4f3d-af31-af675ab452bd" (UID: "7a5d88e3-73a3-4f3d-af31-af675ab452bd"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.141519 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-config-data" (OuterVolumeSpecName: "config-data") pod "267687cf-58da-42e0-852e-c8c87f2ea42a" (UID: "267687cf-58da-42e0-852e-c8c87f2ea42a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.141573 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" (UID: "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.150104 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-config-data" (OuterVolumeSpecName: "config-data") pod "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" (UID: "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.161593 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "770b98aa-f177-4c7b-b37e-1664c039f47d" (UID: "770b98aa-f177-4c7b-b37e-1664c039f47d"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.169019 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.169046 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.169056 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.169064 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a18ae10-7f43-4072-b01c-1564735985be-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.169073 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.169081 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.169090 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.172820 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" (UID: "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.174209 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aadb65a0-295d-4fcf-b148-44480346d357" (UID: "aadb65a0-295d-4fcf-b148-44480346d357"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.176495 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "770b98aa-f177-4c7b-b37e-1664c039f47d" (UID: "770b98aa-f177-4c7b-b37e-1664c039f47d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.178163 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.185565 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data" (OuterVolumeSpecName: "config-data") pod "770b98aa-f177-4c7b-b37e-1664c039f47d" (UID: "770b98aa-f177-4c7b-b37e-1664c039f47d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.186803 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "267687cf-58da-42e0-852e-c8c87f2ea42a" (UID: "267687cf-58da-42e0-852e-c8c87f2ea42a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.186836 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "267687cf-58da-42e0-852e-c8c87f2ea42a" (UID: "267687cf-58da-42e0-852e-c8c87f2ea42a"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.208307 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" (UID: "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.229180 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-ovn-controller-tls-certs" (OuterVolumeSpecName: "ovn-controller-tls-certs") pod "5fe1a689-1241-4c11-93ca-875e53319668" (UID: "5fe1a689-1241-4c11-93ca-875e53319668"). InnerVolumeSpecName "ovn-controller-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.230115 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-config-data" (OuterVolumeSpecName: "config-data") pod "7a5d88e3-73a3-4f3d-af31-af675ab452bd" (UID: "7a5d88e3-73a3-4f3d-af31-af675ab452bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.236604 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e1ffe861-7d12-49e2-9737-fc100833da39" (UID: "e1ffe861-7d12-49e2-9737-fc100833da39"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.243877 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.252974 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data" (OuterVolumeSpecName: "config-data") pod "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" (UID: "1eab7a3f-11f0-4d00-b436-93cc30c2e8e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.261856 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" (UID: "9a438bff-fbe4-4ae4-8d0f-3eecc1819f50"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.271982 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272016 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272026 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272036 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-internal-tls-certs\") on node \"crc\" 
DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272045 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272054 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1ffe861-7d12-49e2-9737-fc100833da39-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272062 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5fe1a689-1241-4c11-93ca-875e53319668-ovn-controller-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272072 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272082 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/267687cf-58da-42e0-852e-c8c87f2ea42a-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272091 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272099 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770b98aa-f177-4c7b-b37e-1664c039f47d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272107 4831 
reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272116 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aadb65a0-295d-4fcf-b148-44480346d357-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.272124 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7a5d88e3-73a3-4f3d-af31-af675ab452bd-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.353022 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" (UID: "cccc3a0b-98c7-4930-a7b5-3c1320a5ee69"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.375260 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:37 crc kubenswrapper[4831]: E1203 06:55:37.375364 4831 configmap.go:193] Couldn't get configMap openstack/openstack-scripts: configmap "openstack-scripts" not found Dec 03 06:55:37 crc kubenswrapper[4831]: E1203 06:55:37.375418 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts podName:6f54df74-81ac-43f7-9075-51cb26200c4e nodeName:}" failed. No retries permitted until 2025-12-03 06:55:41.375398921 +0000 UTC m=+1478.718982449 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts") pod "neutronb468-account-delete-9hl65" (UID: "6f54df74-81ac-43f7-9075-51cb26200c4e") : configmap "openstack-scripts" not found Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.388202 4831 generic.go:334] "Generic (PLEG): container finished" podID="56055ee5-407e-4ced-865f-03585e5f7f7b" containerID="a69329035f458ff6fcb46d258f921e780538a61f81fd40d9c93eb03532d06f8d" exitCode=1 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.390227 4831 generic.go:334] "Generic (PLEG): container finished" podID="5067e964-1daa-4bbd-8e2b-872ce1067389" containerID="8f1ceb54accf6cc95db136b64ce1f4080a7894f0ee7b6160b877d7044f1c4354" exitCode=0 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.392075 4831 generic.go:334] "Generic (PLEG): container finished" podID="e69614e3-a574-42e1-adc2-09861e9974e5" containerID="9ec26ef13af70543f75a8bc515a87380e1e69e18ff78e24db623bf3f171971b6" exitCode=1 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.395771 4831 generic.go:334] "Generic (PLEG): container finished" podID="20722e3f-f810-4ac7-80d7-09cae400150a" containerID="557c4ed3ac9d594aa36c1567b066b8623beb6bdaab234448402e61224dd28e60" exitCode=1 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.400198 4831 generic.go:334] "Generic (PLEG): container finished" podID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerID="309cf75d8cbb562efc0116b04c83dd3b65a4b14e1de35f409f188bc0a497e2b6" exitCode=0 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.405949 4831 generic.go:334] "Generic (PLEG): container finished" podID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerID="e701230003bdb542945df73fbff521649b1c0dbe31d133988a35ce0bd844cb8e" exitCode=0 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.413936 4831 generic.go:334] "Generic (PLEG): container finished" podID="81084d6a-9987-4466-8f89-455aa3ff2627" 
containerID="ac040d35ff00ea3fd5ce4c567c429329ee4b16038dc1dca2dd296dc049531e88" exitCode=1 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.445913 4831 generic.go:334] "Generic (PLEG): container finished" podID="2ef057cb-0d02-49d4-a20d-ac9a3f85484e" containerID="1ce4d862c594baa625d818638ef9af5cb1a640bf2485dc86c14507efd56534c6" exitCode=1 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.453603 4831 generic.go:334] "Generic (PLEG): container finished" podID="54eb5ffe-7e2f-4a33-9689-0470affe10e0" containerID="15b7bbc85d81bb5e07862ca4b4de5ff3923ce96b0caa2a68467827bd0e8e0b7f" exitCode=1 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.455113 4831 generic.go:334] "Generic (PLEG): container finished" podID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerID="0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca" exitCode=143 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.456199 4831 generic.go:334] "Generic (PLEG): container finished" podID="6f54df74-81ac-43f7-9075-51cb26200c4e" containerID="a52c85e8d9ab49bfc227e8d70ef51e897c401fd8ec1f992efd5d76f97e31a07c" exitCode=1 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.472468 4831 generic.go:334] "Generic (PLEG): container finished" podID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" containerID="2083069d237ad9e634573977e455af0f77cad2d64fa39a6dce0010ff03639849" exitCode=0 Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922523 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi8edc-account-delete-2pvdg" event={"ID":"56055ee5-407e-4ced-865f-03585e5f7f7b","Type":"ContainerDied","Data":"a69329035f458ff6fcb46d258f921e780538a61f81fd40d9c93eb03532d06f8d"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novaapi8edc-account-delete-2pvdg" 
event={"ID":"56055ee5-407e-4ced-865f-03585e5f7f7b","Type":"ContainerDied","Data":"a3c72b37845e43bea5a19b3a3ec03f881d38c6c718efa1c8831dee03df6cff88"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922827 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3c72b37845e43bea5a19b3a3ec03f881d38c6c718efa1c8831dee03df6cff88" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922839 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-665dcf9f4f-8g7pd" event={"ID":"5067e964-1daa-4bbd-8e2b-872ce1067389","Type":"ContainerDied","Data":"8f1ceb54accf6cc95db136b64ce1f4080a7894f0ee7b6160b877d7044f1c4354"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922876 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder3a6d-account-delete-b6ptg" event={"ID":"e69614e3-a574-42e1-adc2-09861e9974e5","Type":"ContainerDied","Data":"9ec26ef13af70543f75a8bc515a87380e1e69e18ff78e24db623bf3f171971b6"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922890 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement2619-account-delete-d6v4b" event={"ID":"20722e3f-f810-4ac7-80d7-09cae400150a","Type":"ContainerDied","Data":"557c4ed3ac9d594aa36c1567b066b8623beb6bdaab234448402e61224dd28e60"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922901 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement2619-account-delete-d6v4b" event={"ID":"20722e3f-f810-4ac7-80d7-09cae400150a","Type":"ContainerDied","Data":"dd287a893c75f576c5531a39dd2895a9a3b0aeab155d78a6353480d4431e316c"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922909 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd287a893c75f576c5531a39dd2895a9a3b0aeab155d78a6353480d4431e316c" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922917 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"dc0cbb94-92ec-4369-b609-f3186f302c66","Type":"ContainerDied","Data":"309cf75d8cbb562efc0116b04c83dd3b65a4b14e1de35f409f188bc0a497e2b6"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922951 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d6ac806-4ac5-4de4-b6a0-b265032150f4","Type":"ContainerDied","Data":"e701230003bdb542945df73fbff521649b1c0dbe31d133988a35ce0bd844cb8e"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922966 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican3e51-account-delete-4jnl5" event={"ID":"81084d6a-9987-4466-8f89-455aa3ff2627","Type":"ContainerDied","Data":"ac040d35ff00ea3fd5ce4c567c429329ee4b16038dc1dca2dd296dc049531e88"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922978 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell07556-account-delete-rtljc" event={"ID":"2ef057cb-0d02-49d4-a20d-ac9a3f85484e","Type":"ContainerDied","Data":"1ce4d862c594baa625d818638ef9af5cb1a640bf2485dc86c14507efd56534c6"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922988 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/novacell07556-account-delete-rtljc" event={"ID":"2ef057cb-0d02-49d4-a20d-ac9a3f85484e","Type":"ContainerDied","Data":"d29ae68af819bb49831c646ba7cee33f52be33973d438d7bca8d1ed398a01440"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.922997 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d29ae68af819bb49831c646ba7cee33f52be33973d438d7bca8d1ed398a01440" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.923028 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance3e24-account-delete-nkv48" event={"ID":"54eb5ffe-7e2f-4a33-9689-0470affe10e0","Type":"ContainerDied","Data":"15b7bbc85d81bb5e07862ca4b4de5ff3923ce96b0caa2a68467827bd0e8e0b7f"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 
06:55:37.923046 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance3e24-account-delete-nkv48" event={"ID":"54eb5ffe-7e2f-4a33-9689-0470affe10e0","Type":"ContainerDied","Data":"df3f1f9fe5181fd0912bd9e87bbd99cebc16052fe1a67dcafb470ec88c877bd0"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.923059 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="df3f1f9fe5181fd0912bd9e87bbd99cebc16052fe1a67dcafb470ec88c877bd0" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.923069 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" event={"ID":"4b96e376-8104-4a92-b1ec-6078943d0b50","Type":"ContainerDied","Data":"0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.923112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb468-account-delete-9hl65" event={"ID":"6f54df74-81ac-43f7-9075-51cb26200c4e","Type":"ContainerDied","Data":"a52c85e8d9ab49bfc227e8d70ef51e897c401fd8ec1f992efd5d76f97e31a07c"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.923130 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b","Type":"ContainerDied","Data":"2083069d237ad9e634573977e455af0f77cad2d64fa39a6dce0010ff03639849"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.923141 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b","Type":"ContainerDied","Data":"7e40cb987e0469c635f3de5b8417af2e7616ec9f1700bb9c2af7a4b7938dfee1"} Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.923149 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7e40cb987e0469c635f3de5b8417af2e7616ec9f1700bb9c2af7a4b7938dfee1" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 
06:55:37.932660 4831 scope.go:117] "RemoveContainer" containerID="2a6aed804c2a9f3ed8236fffc4704163803135cc2b4f57e197408d2f4c85bb43" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.965055 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.986366 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mn22b\" (UniqueName: \"kubernetes.io/projected/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-kube-api-access-mn22b\") pod \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\" (UID: \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\") " Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.986625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-operator-scripts\") pod \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\" (UID: \"2ef057cb-0d02-49d4-a20d-ac9a3f85484e\") " Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.988334 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ef057cb-0d02-49d4-a20d-ac9a3f85484e" (UID: "2ef057cb-0d02-49d4-a20d-ac9a3f85484e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:37 crc kubenswrapper[4831]: I1203 06:55:37.992186 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-kube-api-access-mn22b" (OuterVolumeSpecName: "kube-api-access-mn22b") pod "2ef057cb-0d02-49d4-a20d-ac9a3f85484e" (UID: "2ef057cb-0d02-49d4-a20d-ac9a3f85484e"). InnerVolumeSpecName "kube-api-access-mn22b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.096090 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mn22b\" (UniqueName: \"kubernetes.io/projected/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-kube-api-access-mn22b\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.096121 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ef057cb-0d02-49d4-a20d-ac9a3f85484e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.380217 4831 scope.go:117] "RemoveContainer" containerID="d2b937efb5cb363c5d2b1ffebf2c05783efceefbe6e756b381e8b96438b287fb" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.386095 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.401997 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20722e3f-f810-4ac7-80d7-09cae400150a-operator-scripts\") pod \"20722e3f-f810-4ac7-80d7-09cae400150a\" (UID: \"20722e3f-f810-4ac7-80d7-09cae400150a\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.402192 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cth2\" (UniqueName: \"kubernetes.io/projected/20722e3f-f810-4ac7-80d7-09cae400150a-kube-api-access-9cth2\") pod \"20722e3f-f810-4ac7-80d7-09cae400150a\" (UID: \"20722e3f-f810-4ac7-80d7-09cae400150a\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.402656 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/20722e3f-f810-4ac7-80d7-09cae400150a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"20722e3f-f810-4ac7-80d7-09cae400150a" (UID: "20722e3f-f810-4ac7-80d7-09cae400150a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.403177 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/20722e3f-f810-4ac7-80d7-09cae400150a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.407506 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20722e3f-f810-4ac7-80d7-09cae400150a-kube-api-access-9cth2" (OuterVolumeSpecName: "kube-api-access-9cth2") pod "20722e3f-f810-4ac7-80d7-09cae400150a" (UID: "20722e3f-f810-4ac7-80d7-09cae400150a"). InnerVolumeSpecName "kube-api-access-9cth2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.426678 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.438271 4831 scope.go:117] "RemoveContainer" containerID="42a7022437eee40dbce5d592f4a83cabd1f8ff8152cabd031da50508af22483d" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.469959 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-5d9bffbcdd-ztjkw"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.485103 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.489023 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-5d9bffbcdd-ztjkw"] Dec 03 06:55:38 crc kubenswrapper[4831]: E1203 06:55:38.490869 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa is running failed: container process not found" containerID="f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 06:55:38 crc kubenswrapper[4831]: E1203 06:55:38.496994 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa is running failed: container process not found" containerID="f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.501802 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.506754 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-generated\") pod \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.506793 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmfj8\" (UniqueName: \"kubernetes.io/projected/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kube-api-access-fmfj8\") pod 
\"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.506818 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-default\") pod \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.506883 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-combined-ca-bundle\") pod \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.506926 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-galera-tls-certs\") pod \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.506980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-operator-scripts\") pod \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.506999 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.507030 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54eb5ffe-7e2f-4a33-9689-0470affe10e0-operator-scripts\") pod \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\" (UID: \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.507064 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kolla-config\") pod \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\" (UID: \"55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.507143 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flv5c\" (UniqueName: \"kubernetes.io/projected/54eb5ffe-7e2f-4a33-9689-0470affe10e0-kube-api-access-flv5c\") pod \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\" (UID: \"54eb5ffe-7e2f-4a33-9689-0470affe10e0\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.507587 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cth2\" (UniqueName: \"kubernetes.io/projected/20722e3f-f810-4ac7-80d7-09cae400150a-kube-api-access-9cth2\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: E1203 06:55:38.514214 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa is running failed: container process not found" containerID="f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" cmd=["/usr/local/bin/container-scripts/status_check.sh"] Dec 03 06:55:38 crc kubenswrapper[4831]: E1203 06:55:38.514307 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa is running failed: container process not found" 
probeType="Readiness" pod="openstack/ovn-northd-0" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="ovn-northd" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.516990 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.517439 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" (UID: "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.517830 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54eb5ffe-7e2f-4a33-9689-0470affe10e0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "54eb5ffe-7e2f-4a33-9689-0470affe10e0" (UID: "54eb5ffe-7e2f-4a33-9689-0470affe10e0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.517872 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" (UID: "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.518298 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" (UID: "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b"). 
InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.518721 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b548290-abc5-4c67-862c-16aa03a652da/ovn-northd/0.log" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.518757 4831 generic.go:334] "Generic (PLEG): container finished" podID="5b548290-abc5-4c67-862c-16aa03a652da" containerID="f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" exitCode=139 Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.518841 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b548290-abc5-4c67-862c-16aa03a652da","Type":"ContainerDied","Data":"f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa"} Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.522570 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" (UID: "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.545335 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kube-api-access-fmfj8" (OuterVolumeSpecName: "kube-api-access-fmfj8") pod "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" (UID: "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b"). InnerVolumeSpecName "kube-api-access-fmfj8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.545585 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54eb5ffe-7e2f-4a33-9689-0470affe10e0-kube-api-access-flv5c" (OuterVolumeSpecName: "kube-api-access-flv5c") pod "54eb5ffe-7e2f-4a33-9689-0470affe10e0" (UID: "54eb5ffe-7e2f-4a33-9689-0470affe10e0"). InnerVolumeSpecName "kube-api-access-flv5c". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.545639 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.548273 4831 scope.go:117] "RemoveContainer" containerID="0fb9b727228c96fa187f5a0e156c0ff4cac686daaafe9b93470fdfcc2071ea6f" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.565825 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-665dcf9f4f-8g7pd" event={"ID":"5067e964-1daa-4bbd-8e2b-872ce1067389","Type":"ContainerDied","Data":"c92eb12f4c11ef063a6c7bd73114db57fdf21e3ba90802835bee2899ce4b49fa"} Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.565869 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c92eb12f4c11ef063a6c7bd73114db57fdf21e3ba90802835bee2899ce4b49fa" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.578485 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.580776 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "mysql-db") pod "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" (UID: "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.583287 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/kube-state-metrics-0" podUID="5bf96e96-13ba-44c9-b16e-b1c2acbfc643" containerName="kube-state-metrics" probeResult="failure" output="Get \"https://10.217.0.196:8081/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.593378 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-galera-tls-certs" (OuterVolumeSpecName: "galera-tls-certs") pod "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" (UID: "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b"). InnerVolumeSpecName "galera-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.607086 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.607234 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.607780 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"dc0cbb94-92ec-4369-b609-f3186f302c66","Type":"ContainerDied","Data":"2a0709e4a5635f9cc13d8259ae8df8a212e11b8c97b94c1cb186db44ad3b61cb"} Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.607807 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a0709e4a5635f9cc13d8259ae8df8a212e11b8c97b94c1cb186db44ad3b61cb" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610508 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610541 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610553 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/54eb5ffe-7e2f-4a33-9689-0470affe10e0-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610562 4831 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kolla-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610576 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flv5c\" (UniqueName: \"kubernetes.io/projected/54eb5ffe-7e2f-4a33-9689-0470affe10e0-kube-api-access-flv5c\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610586 4831 
reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-generated\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610594 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmfj8\" (UniqueName: \"kubernetes.io/projected/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-kube-api-access-fmfj8\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610605 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-config-data-default\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.610613 4831 reconciler_common.go:293] "Volume detached for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-galera-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.626192 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"8d6ac806-4ac5-4de4-b6a0-b265032150f4","Type":"ContainerDied","Data":"e960d02942ee37d96724ed9e66feb25249f3e46c0430b2f545795a2dd1ce326d"} Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.626227 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e960d02942ee37d96724ed9e66feb25249f3e46c0430b2f545795a2dd1ce326d" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.628024 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" (UID: "55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.628156 4831 scope.go:117] "RemoveContainer" containerID="db914d2ef3f9530c88bd1d73542d378efb6e0c34e42edbdd03ab88b16f91baa9" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.647428 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/memcached-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.648094 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican3e51-account-delete-4jnl5" event={"ID":"81084d6a-9987-4466-8f89-455aa3ff2627","Type":"ContainerDied","Data":"3a6b737f1b9367c00835c9a404c4bc9ba862def65b579c5143283c26dba94964"} Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.648118 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3a6b737f1b9367c00835c9a404c4bc9ba862def65b579c5143283c26dba94964" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.654371 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.654777 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/memcached-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.660168 4831 scope.go:117] "RemoveContainer" containerID="23562686db7ab5892626a9f6b6ffc89b32e0c913addd62ae2225fd861ed8de86" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.663565 4831 generic.go:334] "Generic (PLEG): container finished" podID="fa796212-03db-4860-93f4-d2918ed44070" containerID="e1b665f5de4ff9fea0d7f46233aebcb76bd558b21574f6845a1c4a0745851315" exitCode=0 Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.663646 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"fa796212-03db-4860-93f4-d2918ed44070","Type":"ContainerDied","Data":"e1b665f5de4ff9fea0d7f46233aebcb76bd558b21574f6845a1c4a0745851315"} Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.672970 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder3a6d-account-delete-b6ptg" event={"ID":"e69614e3-a574-42e1-adc2-09861e9974e5","Type":"ContainerDied","Data":"1c0a461627ebef432fa1b5bcbe2e6bde6b1afb13041733bfe0e1de0dc065b1f9"} Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.673006 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c0a461627ebef432fa1b5bcbe2e6bde6b1afb13041733bfe0e1de0dc065b1f9" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.673013 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.673711 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.686184 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.659861 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.704560 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.708794 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721360 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-95nwv"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721418 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-95nwv"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721851 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d6ac806-4ac5-4de4-b6a0-b265032150f4-erlang-cookie-secret\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721912 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-config-data\") pod \"5067e964-1daa-4bbd-8e2b-872ce1067389\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721932 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721954 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/8d6ac806-4ac5-4de4-b6a0-b265032150f4-pod-info\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721977 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-erlang-cookie\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.721994 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69614e3-a574-42e1-adc2-09861e9974e5-operator-scripts\") pod \"e69614e3-a574-42e1-adc2-09861e9974e5\" (UID: \"e69614e3-a574-42e1-adc2-09861e9974e5\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722016 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-scripts\") pod \"5067e964-1daa-4bbd-8e2b-872ce1067389\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722038 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56055ee5-407e-4ced-865f-03585e5f7f7b-operator-scripts\") pod \"56055ee5-407e-4ced-865f-03585e5f7f7b\" (UID: \"56055ee5-407e-4ced-865f-03585e5f7f7b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsdbs\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-kube-api-access-fsdbs\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc 
kubenswrapper[4831]: I1203 06:55:38.722084 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-public-tls-certs\") pod \"5067e964-1daa-4bbd-8e2b-872ce1067389\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722106 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h2rnn\" (UniqueName: \"kubernetes.io/projected/56055ee5-407e-4ced-865f-03585e5f7f7b-kube-api-access-h2rnn\") pod \"56055ee5-407e-4ced-865f-03585e5f7f7b\" (UID: \"56055ee5-407e-4ced-865f-03585e5f7f7b\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722132 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-erlang-cookie\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722155 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-server-conf\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722183 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-plugins-conf\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722206 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-confd\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jsrdw\" (UniqueName: \"kubernetes.io/projected/5067e964-1daa-4bbd-8e2b-872ce1067389-kube-api-access-jsrdw\") pod \"5067e964-1daa-4bbd-8e2b-872ce1067389\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722278 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smwwf\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-kube-api-access-smwwf\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722297 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-tls\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722330 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-fernet-keys\") pod \"5067e964-1daa-4bbd-8e2b-872ce1067389\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722352 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-plugins-conf\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: 
I1203 06:55:38.722387 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-plugins\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722443 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-plugins\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722465 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-confd\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722491 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data\") pod \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\" (UID: \"8d6ac806-4ac5-4de4-b6a0-b265032150f4\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722512 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-tls\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722530 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc0cbb94-92ec-4369-b609-f3186f302c66-erlang-cookie-secret\") pod 
\"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722548 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc0cbb94-92ec-4369-b609-f3186f302c66-pod-info\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722565 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8hrh\" (UniqueName: \"kubernetes.io/projected/e69614e3-a574-42e1-adc2-09861e9974e5-kube-api-access-n8hrh\") pod \"e69614e3-a574-42e1-adc2-09861e9974e5\" (UID: \"e69614e3-a574-42e1-adc2-09861e9974e5\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722584 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-server-conf\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722599 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-combined-ca-bundle\") pod \"5067e964-1daa-4bbd-8e2b-872ce1067389\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722631 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-credential-keys\") pod \"5067e964-1daa-4bbd-8e2b-872ce1067389\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722644 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"dc0cbb94-92ec-4369-b609-f3186f302c66\" (UID: \"dc0cbb94-92ec-4369-b609-f3186f302c66\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.722672 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-internal-tls-certs\") pod \"5067e964-1daa-4bbd-8e2b-872ce1067389\" (UID: \"5067e964-1daa-4bbd-8e2b-872ce1067389\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.723255 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.723270 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: E1203 06:55:38.724441 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23562686db7ab5892626a9f6b6ffc89b32e0c913addd62ae2225fd861ed8de86\": container with ID starting with 23562686db7ab5892626a9f6b6ffc89b32e0c913addd62ae2225fd861ed8de86 not found: ID does not exist" containerID="23562686db7ab5892626a9f6b6ffc89b32e0c913addd62ae2225fd861ed8de86" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.724539 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/novacell07556-account-delete-rtljc" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.724677 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.731970 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.742612 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement2619-account-delete-d6v4b" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.746460 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.750567 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/8d6ac806-4ac5-4de4-b6a0-b265032150f4-pod-info" (OuterVolumeSpecName: "pod-info") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.751932 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutronb468-account-delete-9hl65" event={"ID":"6f54df74-81ac-43f7-9075-51cb26200c4e","Type":"ContainerDied","Data":"2b1f15a184562cc4375ff199a0fe5d2f16f37d8ddc349b48eb679d5612ccb59e"} Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.751986 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.752007 4831 scope.go:117] "RemoveContainer" containerID="a52c85e8d9ab49bfc227e8d70ef51e897c401fd8ec1f992efd5d76f97e31a07c" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.755752 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.756169 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.762660 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.768685 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.768869 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.769507 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56055ee5-407e-4ced-865f-03585e5f7f7b-kube-api-access-h2rnn" (OuterVolumeSpecName: "kube-api-access-h2rnn") pod "56055ee5-407e-4ced-865f-03585e5f7f7b" (UID: "56055ee5-407e-4ced-865f-03585e5f7f7b"). InnerVolumeSpecName "kube-api-access-h2rnn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.788276 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e69614e3-a574-42e1-adc2-09861e9974e5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e69614e3-a574-42e1-adc2-09861e9974e5" (UID: "e69614e3-a574-42e1-adc2-09861e9974e5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.788731 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.789355 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56055ee5-407e-4ced-865f-03585e5f7f7b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "56055ee5-407e-4ced-865f-03585e5f7f7b" (UID: "56055ee5-407e-4ced-865f-03585e5f7f7b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.802902 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.810132 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.810144 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/dc0cbb94-92ec-4369-b609-f3186f302c66-pod-info" (OuterVolumeSpecName: "pod-info") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.810242 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.811572 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.812152 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b548290-abc5-4c67-862c-16aa03a652da/ovn-northd/0.log" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.812219 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.824775 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81084d6a-9987-4466-8f89-455aa3ff2627-operator-scripts\") pod \"81084d6a-9987-4466-8f89-455aa3ff2627\" (UID: \"81084d6a-9987-4466-8f89-455aa3ff2627\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.824833 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts\") pod \"6f54df74-81ac-43f7-9075-51cb26200c4e\" (UID: \"6f54df74-81ac-43f7-9075-51cb26200c4e\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.824872 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpjfw\" (UniqueName: \"kubernetes.io/projected/81084d6a-9987-4466-8f89-455aa3ff2627-kube-api-access-vpjfw\") pod \"81084d6a-9987-4466-8f89-455aa3ff2627\" (UID: \"81084d6a-9987-4466-8f89-455aa3ff2627\") " Dec 03 06:55:38 crc kubenswrapper[4831]: 
I1203 06:55:38.824895 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-scripts\") pod \"fa796212-03db-4860-93f4-d2918ed44070\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825025 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa796212-03db-4860-93f4-d2918ed44070-etc-machine-id\") pod \"fa796212-03db-4860-93f4-d2918ed44070\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825042 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-combined-ca-bundle\") pod \"fa796212-03db-4860-93f4-d2918ed44070\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825057 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwdv6\" (UniqueName: \"kubernetes.io/projected/6f54df74-81ac-43f7-9075-51cb26200c4e-kube-api-access-qwdv6\") pod \"6f54df74-81ac-43f7-9075-51cb26200c4e\" (UID: \"6f54df74-81ac-43f7-9075-51cb26200c4e\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825087 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data\") pod \"fa796212-03db-4860-93f4-d2918ed44070\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825128 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data-custom\") pod 
\"fa796212-03db-4860-93f4-d2918ed44070\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825242 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6687g\" (UniqueName: \"kubernetes.io/projected/fa796212-03db-4860-93f4-d2918ed44070-kube-api-access-6687g\") pod \"fa796212-03db-4860-93f4-d2918ed44070\" (UID: \"fa796212-03db-4860-93f4-d2918ed44070\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825632 4831 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825644 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825653 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825661 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825669 4831 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/dc0cbb94-92ec-4369-b609-f3186f302c66-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825687 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825696 4831 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/8d6ac806-4ac5-4de4-b6a0-b265032150f4-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825705 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825716 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e69614e3-a574-42e1-adc2-09861e9974e5-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825725 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/56055ee5-407e-4ced-865f-03585e5f7f7b-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825735 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h2rnn\" (UniqueName: \"kubernetes.io/projected/56055ee5-407e-4ced-865f-03585e5f7f7b-kube-api-access-h2rnn\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825743 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.825752 4831 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 
06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.826454 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fa796212-03db-4860-93f4-d2918ed44070-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "fa796212-03db-4860-93f4-d2918ed44070" (UID: "fa796212-03db-4860-93f4-d2918ed44070"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.826829 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81084d6a-9987-4466-8f89-455aa3ff2627-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "81084d6a-9987-4466-8f89-455aa3ff2627" (UID: "81084d6a-9987-4466-8f89-455aa3ff2627"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.827162 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6f54df74-81ac-43f7-9075-51cb26200c4e" (UID: "6f54df74-81ac-43f7-9075-51cb26200c4e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.831701 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-kube-api-access-smwwf" (OuterVolumeSpecName: "kube-api-access-smwwf") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "kube-api-access-smwwf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.836957 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81084d6a-9987-4466-8f89-455aa3ff2627-kube-api-access-vpjfw" (OuterVolumeSpecName: "kube-api-access-vpjfw") pod "81084d6a-9987-4466-8f89-455aa3ff2627" (UID: "81084d6a-9987-4466-8f89-455aa3ff2627"). InnerVolumeSpecName "kube-api-access-vpjfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.837021 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.837443 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc0cbb94-92ec-4369-b609-f3186f302c66-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.895482 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.900708 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5067e964-1daa-4bbd-8e2b-872ce1067389-kube-api-access-jsrdw" (OuterVolumeSpecName: "kube-api-access-jsrdw") pod "5067e964-1daa-4bbd-8e2b-872ce1067389" (UID: "5067e964-1daa-4bbd-8e2b-872ce1067389"). InnerVolumeSpecName "kube-api-access-jsrdw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.913014 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.926301 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5067e964-1daa-4bbd-8e2b-872ce1067389" (UID: "5067e964-1daa-4bbd-8e2b-872ce1067389"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.926431 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-kube-api-access-fsdbs" (OuterVolumeSpecName: "kube-api-access-fsdbs") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "kube-api-access-fsdbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.926487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.926532 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d6ac806-4ac5-4de4-b6a0-b265032150f4-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). 
InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.926577 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-scripts" (OuterVolumeSpecName: "scripts") pod "fa796212-03db-4860-93f4-d2918ed44070" (UID: "fa796212-03db-4860-93f4-d2918ed44070"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.926793 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.926856 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa796212-03db-4860-93f4-d2918ed44070-kube-api-access-6687g" (OuterVolumeSpecName: "kube-api-access-6687g") pod "fa796212-03db-4860-93f4-d2918ed44070" (UID: "fa796212-03db-4860-93f4-d2918ed44070"). InnerVolumeSpecName "kube-api-access-6687g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.926947 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "fa796212-03db-4860-93f4-d2918ed44070" (UID: "fa796212-03db-4860-93f4-d2918ed44070"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.927286 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5067e964-1daa-4bbd-8e2b-872ce1067389" (UID: "5067e964-1daa-4bbd-8e2b-872ce1067389"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.927424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-ovn-northd-tls-certs\") pod \"5b548290-abc5-4c67-862c-16aa03a652da\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.927466 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-metrics-certs-tls-certs\") pod \"5b548290-abc5-4c67-862c-16aa03a652da\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.927507 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-config\") pod \"5b548290-abc5-4c67-862c-16aa03a652da\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.927532 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtx5p\" (UniqueName: \"kubernetes.io/projected/5b548290-abc5-4c67-862c-16aa03a652da-kube-api-access-jtx5p\") pod \"5b548290-abc5-4c67-862c-16aa03a652da\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.927605 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-combined-ca-bundle\") pod \"5b548290-abc5-4c67-862c-16aa03a652da\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.927709 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-scripts\") pod \"5b548290-abc5-4c67-862c-16aa03a652da\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.927731 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b548290-abc5-4c67-862c-16aa03a652da-ovn-rundir\") pod \"5b548290-abc5-4c67-862c-16aa03a652da\" (UID: \"5b548290-abc5-4c67-862c-16aa03a652da\") " Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.928469 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-config" (OuterVolumeSpecName: "config") pod "5b548290-abc5-4c67-862c-16aa03a652da" (UID: "5b548290-abc5-4c67-862c-16aa03a652da"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929623 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f54df74-81ac-43f7-9075-51cb26200c4e-kube-api-access-qwdv6" (OuterVolumeSpecName: "kube-api-access-qwdv6") pod "6f54df74-81ac-43f7-9075-51cb26200c4e" (UID: "6f54df74-81ac-43f7-9075-51cb26200c4e"). InnerVolumeSpecName "kube-api-access-qwdv6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929808 4831 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/8d6ac806-4ac5-4de4-b6a0-b265032150f4-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929823 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929834 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6687g\" (UniqueName: \"kubernetes.io/projected/fa796212-03db-4860-93f4-d2918ed44070-kube-api-access-6687g\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929844 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsdbs\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-kube-api-access-fsdbs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929854 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/81084d6a-9987-4466-8f89-455aa3ff2627-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929863 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6f54df74-81ac-43f7-9075-51cb26200c4e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929882 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpjfw\" (UniqueName: \"kubernetes.io/projected/81084d6a-9987-4466-8f89-455aa3ff2627-kube-api-access-vpjfw\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc 
kubenswrapper[4831]: I1203 06:55:38.929892 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929901 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929909 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jsrdw\" (UniqueName: \"kubernetes.io/projected/5067e964-1daa-4bbd-8e2b-872ce1067389-kube-api-access-jsrdw\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929918 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smwwf\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-kube-api-access-smwwf\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929928 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929936 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929944 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fa796212-03db-4860-93f4-d2918ed44070-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929952 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwdv6\" (UniqueName: 
\"kubernetes.io/projected/6f54df74-81ac-43f7-9075-51cb26200c4e-kube-api-access-qwdv6\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929961 4831 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/dc0cbb94-92ec-4369-b609-f3186f302c66-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929971 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929979 4831 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.929999 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.930459 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-scripts" (OuterVolumeSpecName: "scripts") pod "5067e964-1daa-4bbd-8e2b-872ce1067389" (UID: "5067e964-1daa-4bbd-8e2b-872ce1067389"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.936951 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-scripts" (OuterVolumeSpecName: "scripts") pod "5b548290-abc5-4c67-862c-16aa03a652da" (UID: "5b548290-abc5-4c67-862c-16aa03a652da"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.944605 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-config-data" (OuterVolumeSpecName: "config-data") pod "5067e964-1daa-4bbd-8e2b-872ce1067389" (UID: "5067e964-1daa-4bbd-8e2b-872ce1067389"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.946081 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b548290-abc5-4c67-862c-16aa03a652da-ovn-rundir" (OuterVolumeSpecName: "ovn-rundir") pod "5b548290-abc5-4c67-862c-16aa03a652da" (UID: "5b548290-abc5-4c67-862c-16aa03a652da"). InnerVolumeSpecName "ovn-rundir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.947509 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.953531 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e69614e3-a574-42e1-adc2-09861e9974e5-kube-api-access-n8hrh" (OuterVolumeSpecName: "kube-api-access-n8hrh") pod "e69614e3-a574-42e1-adc2-09861e9974e5" (UID: "e69614e3-a574-42e1-adc2-09861e9974e5"). InnerVolumeSpecName "kube-api-access-n8hrh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.973376 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5067e964-1daa-4bbd-8e2b-872ce1067389" (UID: "5067e964-1daa-4bbd-8e2b-872ce1067389"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.977737 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b548290-abc5-4c67-862c-16aa03a652da-kube-api-access-jtx5p" (OuterVolumeSpecName: "kube-api-access-jtx5p") pod "5b548290-abc5-4c67-862c-16aa03a652da" (UID: "5b548290-abc5-4c67-862c-16aa03a652da"). InnerVolumeSpecName "kube-api-access-jtx5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:38 crc kubenswrapper[4831]: I1203 06:55:38.990130 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.003875 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.031379 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a18ae10-7f43-4072-b01c-1564735985be" path="/var/lib/kubelet/pods/1a18ae10-7f43-4072-b01c-1564735985be/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032039 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" path="/var/lib/kubelet/pods/1eab7a3f-11f0-4d00-b436-93cc30c2e8e1/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032745 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtx5p\" (UniqueName: 
\"kubernetes.io/projected/5b548290-abc5-4c67-862c-16aa03a652da-kube-api-access-jtx5p\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032763 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8hrh\" (UniqueName: \"kubernetes.io/projected/e69614e3-a574-42e1-adc2-09861e9974e5-kube-api-access-n8hrh\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032774 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032794 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032805 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5b548290-abc5-4c67-862c-16aa03a652da-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032814 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5b548290-abc5-4c67-862c-16aa03a652da-ovn-rundir\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032822 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.032830 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 
06:55:39.034680 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" path="/var/lib/kubelet/pods/267687cf-58da-42e0-852e-c8c87f2ea42a/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.035525 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe1a689-1241-4c11-93ca-875e53319668" path="/var/lib/kubelet/pods/5fe1a689-1241-4c11-93ca-875e53319668/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.037174 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" path="/var/lib/kubelet/pods/7a5d88e3-73a3-4f3d-af31-af675ab452bd/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.038048 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" path="/var/lib/kubelet/pods/9a438bff-fbe4-4ae4-8d0f-3eecc1819f50/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.039209 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" path="/var/lib/kubelet/pods/a5bbc77d-ca7b-48ab-b8bc-b304a12bb586/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.040962 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c60bce87-ea0b-4b3d-8243-93ed40c232ff" path="/var/lib/kubelet/pods/c60bce87-ea0b-4b3d-8243-93ed40c232ff/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.042794 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" path="/var/lib/kubelet/pods/cccc3a0b-98c7-4930-a7b5-3c1320a5ee69/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.051366 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data" (OuterVolumeSpecName: "config-data") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" 
(UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.052249 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ffe861-7d12-49e2-9737-fc100833da39" path="/var/lib/kubelet/pods/e1ffe861-7d12-49e2-9737-fc100833da39/volumes" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.094039 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-server-conf" (OuterVolumeSpecName: "server-conf") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.110214 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.110259 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-574cdc6988-72ggg"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.110272 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-574cdc6988-72ggg"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.115393 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data" (OuterVolumeSpecName: "config-data") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.117660 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.135107 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.135133 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.135148 4831 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.135157 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/8d6ac806-4ac5-4de4-b6a0-b265032150f4-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.183980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-server-conf" (OuterVolumeSpecName: "server-conf") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.208487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b548290-abc5-4c67-862c-16aa03a652da" (UID: "5b548290-abc5-4c67-862c-16aa03a652da"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.208624 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5067e964-1daa-4bbd-8e2b-872ce1067389" (UID: "5067e964-1daa-4bbd-8e2b-872ce1067389"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.229655 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement2619-account-delete-d6v4b"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.236453 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.236482 4831 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/dc0cbb94-92ec-4369-b609-f3186f302c66-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.236492 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.258596 4831 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement2619-account-delete-d6v4b"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.261895 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fa796212-03db-4860-93f4-d2918ed44070" (UID: "fa796212-03db-4860-93f4-d2918ed44070"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.271520 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5067e964-1daa-4bbd-8e2b-872ce1067389" (UID: "5067e964-1daa-4bbd-8e2b-872ce1067389"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.281258 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novacell07556-account-delete-rtljc"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.283924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-ovn-northd-tls-certs" (OuterVolumeSpecName: "ovn-northd-tls-certs") pod "5b548290-abc5-4c67-862c-16aa03a652da" (UID: "5b548290-abc5-4c67-862c-16aa03a652da"). InnerVolumeSpecName "ovn-northd-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.295146 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novacell07556-account-delete-rtljc"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.311091 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.325706 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "8d6ac806-4ac5-4de4-b6a0-b265032150f4" (UID: "8d6ac806-4ac5-4de4-b6a0-b265032150f4"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.335366 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.339494 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5067e964-1daa-4bbd-8e2b-872ce1067389-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.339551 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-ovn-northd-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.339561 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.339571 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/8d6ac806-4ac5-4de4-b6a0-b265032150f4-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.341185 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data" (OuterVolumeSpecName: "config-data") pod "fa796212-03db-4860-93f4-d2918ed44070" (UID: "fa796212-03db-4860-93f4-d2918ed44070"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.343407 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "dc0cbb94-92ec-4369-b609-f3186f302c66" (UID: "dc0cbb94-92ec-4369-b609-f3186f302c66"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.352493 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-metrics-certs-tls-certs" (OuterVolumeSpecName: "metrics-certs-tls-certs") pod "5b548290-abc5-4c67-862c-16aa03a652da" (UID: "5b548290-abc5-4c67-862c-16aa03a652da"). InnerVolumeSpecName "metrics-certs-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.442328 4831 reconciler_common.go:293] "Volume detached for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5b548290-abc5-4c67-862c-16aa03a652da-metrics-certs-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.442370 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/dc0cbb94-92ec-4369-b609-f3186f302c66-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.442385 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fa796212-03db-4860-93f4-d2918ed44070-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.512312 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.538457 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.542893 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-combined-ca-bundle\") pod \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.542933 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-config\") pod \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.543006 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-internal-tls-certs\") pod \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.543041 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jrh9v\" (UniqueName: \"kubernetes.io/projected/1f442a70-f040-4b0e-853d-6ce1f4caf63d-kube-api-access-jrh9v\") pod \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.543090 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-public-tls-certs\") pod \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.543142 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-ovndb-tls-certs\") pod \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.543159 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-httpd-config\") pod \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\" (UID: \"1f442a70-f040-4b0e-853d-6ce1f4caf63d\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.546951 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f442a70-f040-4b0e-853d-6ce1f4caf63d-kube-api-access-jrh9v" (OuterVolumeSpecName: "kube-api-access-jrh9v") pod "1f442a70-f040-4b0e-853d-6ce1f4caf63d" (UID: "1f442a70-f040-4b0e-853d-6ce1f4caf63d"). InnerVolumeSpecName "kube-api-access-jrh9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.548191 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "1f442a70-f040-4b0e-853d-6ce1f4caf63d" (UID: "1f442a70-f040-4b0e-853d-6ce1f4caf63d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.619707 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-config" (OuterVolumeSpecName: "config") pod "1f442a70-f040-4b0e-853d-6ce1f4caf63d" (UID: "1f442a70-f040-4b0e-853d-6ce1f4caf63d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.622559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1f442a70-f040-4b0e-853d-6ce1f4caf63d" (UID: "1f442a70-f040-4b0e-853d-6ce1f4caf63d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.625577 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "1f442a70-f040-4b0e-853d-6ce1f4caf63d" (UID: "1f442a70-f040-4b0e-853d-6ce1f4caf63d"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.644691 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-log-httpd\") pod \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.644804 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-config-data\") pod \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.644844 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-run-httpd\") pod \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " Dec 03 06:55:39 
crc kubenswrapper[4831]: I1203 06:55:39.644869 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-combined-ca-bundle\") pod \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.644901 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-ceilometer-tls-certs\") pod \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.644932 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-scripts\") pod \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.644985 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f6qqp\" (UniqueName: \"kubernetes.io/projected/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-kube-api-access-f6qqp\") pod \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.645052 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-sg-core-conf-yaml\") pod \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\" (UID: \"86a3b2cb-5698-41ab-939f-a7f1e3ccc998\") " Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.645388 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.645408 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jrh9v\" (UniqueName: \"kubernetes.io/projected/1f442a70-f040-4b0e-853d-6ce1f4caf63d-kube-api-access-jrh9v\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.645421 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.645433 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.645420 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "86a3b2cb-5698-41ab-939f-a7f1e3ccc998" (UID: "86a3b2cb-5698-41ab-939f-a7f1e3ccc998"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.645445 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-config\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.645883 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "86a3b2cb-5698-41ab-939f-a7f1e3ccc998" (UID: "86a3b2cb-5698-41ab-939f-a7f1e3ccc998"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.647605 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "1f442a70-f040-4b0e-853d-6ce1f4caf63d" (UID: "1f442a70-f040-4b0e-853d-6ce1f4caf63d"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.651662 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-kube-api-access-f6qqp" (OuterVolumeSpecName: "kube-api-access-f6qqp") pod "86a3b2cb-5698-41ab-939f-a7f1e3ccc998" (UID: "86a3b2cb-5698-41ab-939f-a7f1e3ccc998"). InnerVolumeSpecName "kube-api-access-f6qqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.666494 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-scripts" (OuterVolumeSpecName: "scripts") pod "86a3b2cb-5698-41ab-939f-a7f1e3ccc998" (UID: "86a3b2cb-5698-41ab-939f-a7f1e3ccc998"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.683649 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "86a3b2cb-5698-41ab-939f-a7f1e3ccc998" (UID: "86a3b2cb-5698-41ab-939f-a7f1e3ccc998"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.692454 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "1f442a70-f040-4b0e-853d-6ce1f4caf63d" (UID: "1f442a70-f040-4b0e-853d-6ce1f4caf63d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.730499 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "86a3b2cb-5698-41ab-939f-a7f1e3ccc998" (UID: "86a3b2cb-5698-41ab-939f-a7f1e3ccc998"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.738884 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutronb468-account-delete-9hl65" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.747265 4831 generic.go:334] "Generic (PLEG): container finished" podID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerID="dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d" exitCode=0 Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.747546 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-844cdc6797-kqpvp" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.749660 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844cdc6797-kqpvp" event={"ID":"1f442a70-f040-4b0e-853d-6ce1f4caf63d","Type":"ContainerDied","Data":"dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d"} Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.755847 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-844cdc6797-kqpvp" event={"ID":"1f442a70-f040-4b0e-853d-6ce1f4caf63d","Type":"ContainerDied","Data":"734823a44542390e6448c01e16a0d2f0ddfac87e3c04e07062f8b210d94e02cb"} Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.756281 4831 scope.go:117] "RemoveContainer" containerID="504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.758189 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_5b548290-abc5-4c67-862c-16aa03a652da/ovn-northd/0.log" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.758392 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"5b548290-abc5-4c67-862c-16aa03a652da","Type":"ContainerDied","Data":"9163527ee0c82ba9f14f26c03aa04b245c67e69f182fc2dafc6a7e0626cc9046"} Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.758745 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.765079 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"fa796212-03db-4860-93f4-d2918ed44070","Type":"ContainerDied","Data":"9c34c729095ae2910e230e2b44b9dc0906e76610536e9e1b7bcd5fc60d54ad73"} Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.765184 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.768388 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f6qqp\" (UniqueName: \"kubernetes.io/projected/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-kube-api-access-f6qqp\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.768416 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.768425 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.769045 4831 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/1f442a70-f040-4b0e-853d-6ce1f4caf63d-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.769065 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.769075 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.769083 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 
06:55:39.769170 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.772229 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutronb468-account-delete-9hl65"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.772286 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutronb468-account-delete-9hl65"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.774646 4831 generic.go:334] "Generic (PLEG): container finished" podID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerID="7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7" exitCode=0 Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.774708 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerDied","Data":"7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7"} Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.774728 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"86a3b2cb-5698-41ab-939f-a7f1e3ccc998","Type":"ContainerDied","Data":"7103e91ff04023ddb4c24a272266745e9e189b6b2220b5897dd2cdc470bfc0b1"} Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.774797 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.777281 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican3e51-account-delete-4jnl5" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.777658 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-665dcf9f4f-8g7pd" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.777657 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.777691 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/novaapi8edc-account-delete-2pvdg" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.777665 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance3e24-account-delete-nkv48" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.777696 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.777720 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder3a6d-account-delete-b6ptg" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.785511 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "86a3b2cb-5698-41ab-939f-a7f1e3ccc998" (UID: "86a3b2cb-5698-41ab-939f-a7f1e3ccc998"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.788869 4831 scope.go:117] "RemoveContainer" containerID="dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.802120 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-config-data" (OuterVolumeSpecName: "config-data") pod "86a3b2cb-5698-41ab-939f-a7f1e3ccc998" (UID: "86a3b2cb-5698-41ab-939f-a7f1e3ccc998"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.836057 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-844cdc6797-kqpvp"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.851743 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-844cdc6797-kqpvp"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.857331 4831 scope.go:117] "RemoveContainer" containerID="504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e" Dec 03 06:55:39 crc kubenswrapper[4831]: E1203 06:55:39.858296 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e\": container with ID starting with 504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e not found: ID does not exist" containerID="504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.858362 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e"} err="failed to get container status \"504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e\": rpc error: code = NotFound desc = could not find container \"504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e\": container with ID starting with 504d875e8e97248f8475ef8011f761759ddfc7620a7fb784353a5a8aabe3869e not found: ID does not exist" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.858390 4831 scope.go:117] "RemoveContainer" containerID="dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d" Dec 03 06:55:39 crc kubenswrapper[4831]: E1203 06:55:39.859171 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d\": container with ID starting with dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d not found: ID does not exist" containerID="dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.859709 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d"} err="failed to get container status \"dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d\": rpc error: code = NotFound desc = could not find container \"dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d\": container with ID starting with dbe9b6c70e9b9ed94d9e8daa47ea19329c02389842e6765ffe90adb685ffb61d not found: ID does not exist" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.859742 4831 scope.go:117] "RemoveContainer" containerID="86e3f1fa5d839f3ee714b40f2722df67d64bd2305d8193e7a010af7f96797f76" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.862476 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican3e51-account-delete-4jnl5"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.874069 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.874113 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/86a3b2cb-5698-41ab-939f-a7f1e3ccc998-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.876420 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican3e51-account-delete-4jnl5"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.887840 
4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance3e24-account-delete-nkv48"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.901750 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance3e24-account-delete-nkv48"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.916231 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/novaapi8edc-account-delete-2pvdg"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.917105 4831 scope.go:117] "RemoveContainer" containerID="f0eb2d6ba5ef1315aaa77b25c9e9c78a2a644f93319acc5b929f740adc9e3caa" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.927825 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/novaapi8edc-account-delete-2pvdg"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.937660 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder3a6d-account-delete-b6ptg"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.948929 4831 scope.go:117] "RemoveContainer" containerID="fd1e277c8951121dadc44ba354b735192014953fab00452b13030855e8591cd1" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.960586 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder3a6d-account-delete-b6ptg"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.966887 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.976533 4831 scope.go:117] "RemoveContainer" containerID="e1b665f5de4ff9fea0d7f46233aebcb76bd558b21574f6845a1c4a0745851315" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.976748 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.982701 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 
06:55:39.996857 4831 scope.go:117] "RemoveContainer" containerID="261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8" Dec 03 06:55:39 crc kubenswrapper[4831]: I1203 06:55:39.999765 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.007032 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-665dcf9f4f-8g7pd"] Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.014307 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-665dcf9f4f-8g7pd"] Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.022438 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.024178 4831 scope.go:117] "RemoveContainer" containerID="e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.029331 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.040273 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.054625 4831 scope.go:117] "RemoveContainer" containerID="7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.055957 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 06:55:40 crc kubenswrapper[4831]: E1203 06:55:40.074942 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5067e964_1daa_4bbd_8e2b_872ce1067389.slice/crio-c92eb12f4c11ef063a6c7bd73114db57fdf21e3ba90802835bee2899ce4b49fa\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b548290_abc5_4c67_862c_16aa03a652da.slice/crio-9163527ee0c82ba9f14f26c03aa04b245c67e69f182fc2dafc6a7e0626cc9046\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa796212_03db_4860_93f4_d2918ed44070.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5067e964_1daa_4bbd_8e2b_872ce1067389.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6ac806_4ac5_4de4_b6a0_b265032150f4.slice/crio-e960d02942ee37d96724ed9e66feb25249f3e46c0430b2f545795a2dd1ce326d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d6ac806_4ac5_4de4_b6a0_b265032150f4.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc0cbb94_92ec_4369_b609_f3186f302c66.slice/crio-2a0709e4a5635f9cc13d8259ae8df8a212e11b8c97b94c1cb186db44ad3b61cb\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b548290_abc5_4c67_862c_16aa03a652da.slice\": RecentStats: unable to find data in memory cache]" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.079043 4831 scope.go:117] "RemoveContainer" containerID="fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.099731 4831 scope.go:117] "RemoveContainer" containerID="261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8" Dec 03 06:55:40 crc kubenswrapper[4831]: E1203 06:55:40.100586 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8\": container with ID 
starting with 261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8 not found: ID does not exist" containerID="261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.100641 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8"} err="failed to get container status \"261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8\": rpc error: code = NotFound desc = could not find container \"261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8\": container with ID starting with 261de1853dcb7663f85295db044aad677b35575a677eb0bba2a868477180e3c8 not found: ID does not exist" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.100669 4831 scope.go:117] "RemoveContainer" containerID="e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4" Dec 03 06:55:40 crc kubenswrapper[4831]: E1203 06:55:40.100901 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4\": container with ID starting with e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4 not found: ID does not exist" containerID="e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.100920 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4"} err="failed to get container status \"e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4\": rpc error: code = NotFound desc = could not find container \"e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4\": container with ID starting with e71ff33317182316540c85e64e3ed72394f4b793212b2b2ad2dcf5211991e1f4 not found: 
ID does not exist" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.100934 4831 scope.go:117] "RemoveContainer" containerID="7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7" Dec 03 06:55:40 crc kubenswrapper[4831]: E1203 06:55:40.101125 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7\": container with ID starting with 7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7 not found: ID does not exist" containerID="7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.101149 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7"} err="failed to get container status \"7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7\": rpc error: code = NotFound desc = could not find container \"7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7\": container with ID starting with 7e4f54e18bab0081a153be21f4ff39501c5a9212464c6700ffd2af4a479be6b7 not found: ID does not exist" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.101163 4831 scope.go:117] "RemoveContainer" containerID="fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d" Dec 03 06:55:40 crc kubenswrapper[4831]: E1203 06:55:40.101521 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d\": container with ID starting with fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d not found: ID does not exist" containerID="fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.101536 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d"} err="failed to get container status \"fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d\": rpc error: code = NotFound desc = could not find container \"fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d\": container with ID starting with fcc1e170c1524fa94e168039a9e3d7633e7672aa2c44dec0ccc17b3dca0a710d not found: ID does not exist" Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.109335 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:55:40 crc kubenswrapper[4831]: I1203 06:55:40.113766 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.022830 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" path="/var/lib/kubelet/pods/1f442a70-f040-4b0e-853d-6ce1f4caf63d/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.024379 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20722e3f-f810-4ac7-80d7-09cae400150a" path="/var/lib/kubelet/pods/20722e3f-f810-4ac7-80d7-09cae400150a/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.025082 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ef057cb-0d02-49d4-a20d-ac9a3f85484e" path="/var/lib/kubelet/pods/2ef057cb-0d02-49d4-a20d-ac9a3f85484e/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.026737 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5067e964-1daa-4bbd-8e2b-872ce1067389" path="/var/lib/kubelet/pods/5067e964-1daa-4bbd-8e2b-872ce1067389/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.027428 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54eb5ffe-7e2f-4a33-9689-0470affe10e0" 
path="/var/lib/kubelet/pods/54eb5ffe-7e2f-4a33-9689-0470affe10e0/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.028190 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" path="/var/lib/kubelet/pods/55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.030000 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56055ee5-407e-4ced-865f-03585e5f7f7b" path="/var/lib/kubelet/pods/56055ee5-407e-4ced-865f-03585e5f7f7b/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.030724 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b548290-abc5-4c67-862c-16aa03a652da" path="/var/lib/kubelet/pods/5b548290-abc5-4c67-862c-16aa03a652da/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.031467 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f54df74-81ac-43f7-9075-51cb26200c4e" path="/var/lib/kubelet/pods/6f54df74-81ac-43f7-9075-51cb26200c4e/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.032526 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" path="/var/lib/kubelet/pods/770b98aa-f177-4c7b-b37e-1664c039f47d/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.033195 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81084d6a-9987-4466-8f89-455aa3ff2627" path="/var/lib/kubelet/pods/81084d6a-9987-4466-8f89-455aa3ff2627/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.033843 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" path="/var/lib/kubelet/pods/86a3b2cb-5698-41ab-939f-a7f1e3ccc998/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.035253 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" 
path="/var/lib/kubelet/pods/8d6ac806-4ac5-4de4-b6a0-b265032150f4/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.035851 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aadb65a0-295d-4fcf-b148-44480346d357" path="/var/lib/kubelet/pods/aadb65a0-295d-4fcf-b148-44480346d357/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.037004 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" path="/var/lib/kubelet/pods/dc0cbb94-92ec-4369-b609-f3186f302c66/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.037613 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e69614e3-a574-42e1-adc2-09861e9974e5" path="/var/lib/kubelet/pods/e69614e3-a574-42e1-adc2-09861e9974e5/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: I1203 06:55:41.038170 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa796212-03db-4860-93f4-d2918ed44070" path="/var/lib/kubelet/pods/fa796212-03db-4860-93f4-d2918ed44070/volumes" Dec 03 06:55:41 crc kubenswrapper[4831]: E1203 06:55:41.688029 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:41 crc kubenswrapper[4831]: E1203 06:55:41.688639 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:41 crc kubenswrapper[4831]: E1203 
06:55:41.688729 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:41 crc kubenswrapper[4831]: E1203 06:55:41.689482 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:41 crc kubenswrapper[4831]: E1203 06:55:41.689529 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:55:41 crc kubenswrapper[4831]: E1203 06:55:41.690431 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:41 crc kubenswrapper[4831]: E1203 06:55:41.692086 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , 
exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:41 crc kubenswrapper[4831]: E1203 06:55:41.692151 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:55:43 crc kubenswrapper[4831]: I1203 06:55:43.114580 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:43 crc kubenswrapper[4831]: I1203 06:55:43.114587 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:46 crc kubenswrapper[4831]: E1203 06:55:46.686632 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:46 crc kubenswrapper[4831]: E1203 06:55:46.687443 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: 
checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:46 crc kubenswrapper[4831]: E1203 06:55:46.687785 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:46 crc kubenswrapper[4831]: E1203 06:55:46.687838 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:55:46 crc kubenswrapper[4831]: E1203 06:55:46.688188 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:46 crc kubenswrapper[4831]: E1203 06:55:46.689926 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" 
cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:46 crc kubenswrapper[4831]: E1203 06:55:46.691683 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:46 crc kubenswrapper[4831]: E1203 06:55:46.691741 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:55:48 crc kubenswrapper[4831]: I1203 06:55:48.125505 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:48 crc kubenswrapper[4831]: I1203 06:55:48.125509 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:51 crc kubenswrapper[4831]: E1203 06:55:51.687255 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" 
containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:51 crc kubenswrapper[4831]: E1203 06:55:51.687991 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:51 crc kubenswrapper[4831]: E1203 06:55:51.688355 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:51 crc kubenswrapper[4831]: E1203 06:55:51.688697 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:55:51 crc kubenswrapper[4831]: E1203 06:55:51.688588 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:51 crc kubenswrapper[4831]: E1203 
06:55:51.690367 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:51 crc kubenswrapper[4831]: E1203 06:55:51.692059 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:51 crc kubenswrapper[4831]: E1203 06:55:51.692107 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:55:53 crc kubenswrapper[4831]: I1203 06:55:53.135501 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:53 crc kubenswrapper[4831]: I1203 06:55:53.135523 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:56 crc kubenswrapper[4831]: E1203 06:55:56.687529 4831 
log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:56 crc kubenswrapper[4831]: E1203 06:55:56.688483 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:56 crc kubenswrapper[4831]: E1203 06:55:56.689487 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:56 crc kubenswrapper[4831]: E1203 06:55:56.689476 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" cmd=["/usr/local/bin/container-scripts/ovsdb_server_readiness.sh"] Dec 03 06:55:56 crc kubenswrapper[4831]: E1203 06:55:56.689556 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 
63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 is running failed: container process not found" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:55:56 crc kubenswrapper[4831]: E1203 06:55:56.692686 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:56 crc kubenswrapper[4831]: E1203 06:55:56.696311 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" cmd=["/usr/local/bin/container-scripts/vswitchd_readiness.sh"] Dec 03 06:55:56 crc kubenswrapper[4831]: E1203 06:55:56.696397 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/ovn-controller-ovs-w7h89" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:55:57 crc kubenswrapper[4831]: I1203 06:55:57.597175 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:55:57 crc kubenswrapper[4831]: I1203 06:55:57.597253 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:55:58 crc kubenswrapper[4831]: I1203 06:55:58.144613 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:58 crc kubenswrapper[4831]: I1203 06:55:58.144631 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.774227 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-w7h89_7f657b4b-bed8-4244-8727-2a3c59364041/ovs-vswitchd/0.log" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.775032 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946273 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-lib\") pod \"7f657b4b-bed8-4244-8727-2a3c59364041\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946621 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-log\") pod \"7f657b4b-bed8-4244-8727-2a3c59364041\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946679 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pqxbl\" (UniqueName: \"kubernetes.io/projected/7f657b4b-bed8-4244-8727-2a3c59364041-kube-api-access-pqxbl\") pod \"7f657b4b-bed8-4244-8727-2a3c59364041\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946718 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f657b4b-bed8-4244-8727-2a3c59364041-scripts\") pod \"7f657b4b-bed8-4244-8727-2a3c59364041\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946558 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-lib" (OuterVolumeSpecName: "var-lib") pod "7f657b4b-bed8-4244-8727-2a3c59364041" (UID: "7f657b4b-bed8-4244-8727-2a3c59364041"). InnerVolumeSpecName "var-lib". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946797 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-run\") pod \"7f657b4b-bed8-4244-8727-2a3c59364041\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946821 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-etc-ovs\") pod \"7f657b4b-bed8-4244-8727-2a3c59364041\" (UID: \"7f657b4b-bed8-4244-8727-2a3c59364041\") " Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946823 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-log" (OuterVolumeSpecName: "var-log") pod "7f657b4b-bed8-4244-8727-2a3c59364041" (UID: "7f657b4b-bed8-4244-8727-2a3c59364041"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946890 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-run" (OuterVolumeSpecName: "var-run") pod "7f657b4b-bed8-4244-8727-2a3c59364041" (UID: "7f657b4b-bed8-4244-8727-2a3c59364041"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.946998 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-etc-ovs" (OuterVolumeSpecName: "etc-ovs") pod "7f657b4b-bed8-4244-8727-2a3c59364041" (UID: "7f657b4b-bed8-4244-8727-2a3c59364041"). InnerVolumeSpecName "etc-ovs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.947120 4831 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.947133 4831 reconciler_common.go:293] "Volume detached for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-etc-ovs\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.947141 4831 reconciler_common.go:293] "Volume detached for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-lib\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.947150 4831 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/7f657b4b-bed8-4244-8727-2a3c59364041-var-log\") on node \"crc\" DevicePath \"\"" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.947826 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f657b4b-bed8-4244-8727-2a3c59364041-scripts" (OuterVolumeSpecName: "scripts") pod "7f657b4b-bed8-4244-8727-2a3c59364041" (UID: "7f657b4b-bed8-4244-8727-2a3c59364041"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.951231 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f657b4b-bed8-4244-8727-2a3c59364041-kube-api-access-pqxbl" (OuterVolumeSpecName: "kube-api-access-pqxbl") pod "7f657b4b-bed8-4244-8727-2a3c59364041" (UID: "7f657b4b-bed8-4244-8727-2a3c59364041"). InnerVolumeSpecName "kube-api-access-pqxbl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:55:59 crc kubenswrapper[4831]: I1203 06:55:59.962859 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.028749 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerID="fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7" exitCode=137 Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.028811 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7"} Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.028854 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.028886 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5","Type":"ContainerDied","Data":"8c5c1ed90709483c852f361b002cbf1be4c46fa6a876d4a80cc2c291e47730c3"} Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.028914 4831 scope.go:117] "RemoveContainer" containerID="fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.030858 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-w7h89_7f657b4b-bed8-4244-8727-2a3c59364041/ovs-vswitchd/0.log" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.031592 4831 generic.go:334] "Generic (PLEG): container finished" podID="7f657b4b-bed8-4244-8727-2a3c59364041" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" exitCode=137 Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.031660 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w7h89" event={"ID":"7f657b4b-bed8-4244-8727-2a3c59364041","Type":"ContainerDied","Data":"e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74"} Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.031694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-w7h89" event={"ID":"7f657b4b-bed8-4244-8727-2a3c59364041","Type":"ContainerDied","Data":"77e4435c2b9852cf3bd63836f6d3dc5a707ee52c2e763cfa611a28d7bbaa5e2a"} Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.031786 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-w7h89" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.048813 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pqxbl\" (UniqueName: \"kubernetes.io/projected/7f657b4b-bed8-4244-8727-2a3c59364041-kube-api-access-pqxbl\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.048850 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f657b4b-bed8-4244-8727-2a3c59364041-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.053876 4831 scope.go:117] "RemoveContainer" containerID="4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.072069 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ovs-w7h89"] Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.076659 4831 scope.go:117] "RemoveContainer" containerID="c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.080487 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ovs-w7h89"] Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 
06:56:00.096412 4831 scope.go:117] "RemoveContainer" containerID="028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.116784 4831 scope.go:117] "RemoveContainer" containerID="a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.134185 4831 scope.go:117] "RemoveContainer" containerID="46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.150036 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.150078 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-cache\") pod \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.150201 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp86h\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-kube-api-access-jp86h\") pod \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.150230 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") pod \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.150295 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"lock\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-lock\") pod \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\" (UID: \"7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5\") " Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.150935 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-lock" (OuterVolumeSpecName: "lock") pod "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.151016 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-cache" (OuterVolumeSpecName: "cache") pod "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.152800 4831 scope.go:117] "RemoveContainer" containerID="b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.153331 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-kube-api-access-jp86h" (OuterVolumeSpecName: "kube-api-access-jp86h") pod "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5"). InnerVolumeSpecName "kube-api-access-jp86h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.153504 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5"). 
InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.153506 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "swift") pod "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" (UID: "7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.173395 4831 scope.go:117] "RemoveContainer" containerID="d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.191636 4831 scope.go:117] "RemoveContainer" containerID="2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.208614 4831 scope.go:117] "RemoveContainer" containerID="6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.226929 4831 scope.go:117] "RemoveContainer" containerID="a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.247555 4831 scope.go:117] "RemoveContainer" containerID="56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.251313 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jp86h\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-kube-api-access-jp86h\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.251398 4831 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-etc-swift\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.251452 
4831 reconciler_common.go:293] "Volume detached for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-lock\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.251562 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.251792 4831 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5-cache\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.266220 4831 scope.go:117] "RemoveContainer" containerID="8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.269981 4831 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.281791 4831 scope.go:117] "RemoveContainer" containerID="3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.298725 4831 scope.go:117] "RemoveContainer" containerID="f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.328395 4831 scope.go:117] "RemoveContainer" containerID="fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.328942 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7\": container with ID starting with fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7 not 
found: ID does not exist" containerID="fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.328972 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7"} err="failed to get container status \"fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7\": rpc error: code = NotFound desc = could not find container \"fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7\": container with ID starting with fd5780a1f6eae3fedba096674fc112a06bd17051dfaba7a6dbcd0187d08efbe7 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.328994 4831 scope.go:117] "RemoveContainer" containerID="4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.334146 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41\": container with ID starting with 4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41 not found: ID does not exist" containerID="4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.334181 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41"} err="failed to get container status \"4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41\": rpc error: code = NotFound desc = could not find container \"4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41\": container with ID starting with 4f4ec3457fbe407088edd9608e37a0d64f94f5f8a856063df097964d16e89a41 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.334207 
4831 scope.go:117] "RemoveContainer" containerID="c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.334600 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a\": container with ID starting with c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a not found: ID does not exist" containerID="c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.334620 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a"} err="failed to get container status \"c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a\": rpc error: code = NotFound desc = could not find container \"c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a\": container with ID starting with c3f41ef8ab13f137c3b5591e5b203f6dabd3e35924d132650708724f52cb311a not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.334640 4831 scope.go:117] "RemoveContainer" containerID="028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.336069 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea\": container with ID starting with 028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea not found: ID does not exist" containerID="028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.336150 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea"} err="failed to get container status \"028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea\": rpc error: code = NotFound desc = could not find container \"028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea\": container with ID starting with 028761397bb478f7acb2d80c81c0b10fa950424ecf33e2cf4134085e78dfa0ea not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.336197 4831 scope.go:117] "RemoveContainer" containerID="a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.336728 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f\": container with ID starting with a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f not found: ID does not exist" containerID="a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.336815 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f"} err="failed to get container status \"a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f\": rpc error: code = NotFound desc = could not find container \"a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f\": container with ID starting with a45f8080689abc78feae747c5b6498cddb0646c863625a0fcdc630c1aa9a119f not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.336852 4831 scope.go:117] "RemoveContainer" containerID="46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.341729 4831 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d\": container with ID starting with 46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d not found: ID does not exist" containerID="46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.341783 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d"} err="failed to get container status \"46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d\": rpc error: code = NotFound desc = could not find container \"46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d\": container with ID starting with 46ddf46be9c06097453cf59db024eae7e40683f0a9989b74823fd4ae301bb17d not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.341819 4831 scope.go:117] "RemoveContainer" containerID="b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.342175 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f\": container with ID starting with b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f not found: ID does not exist" containerID="b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.342224 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f"} err="failed to get container status \"b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f\": rpc error: code = NotFound desc = could not find container 
\"b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f\": container with ID starting with b9831c9567de92c654defb6777172aef010ae80be0cc6394662c2231bb36974f not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.342250 4831 scope.go:117] "RemoveContainer" containerID="d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.342547 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096\": container with ID starting with d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096 not found: ID does not exist" containerID="d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.342648 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096"} err="failed to get container status \"d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096\": rpc error: code = NotFound desc = could not find container \"d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096\": container with ID starting with d637e5874dd130a1ed498e16b28107fbbaefcfdf4116b7e7df8a194953476096 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.342725 4831 scope.go:117] "RemoveContainer" containerID="2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.344530 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde\": container with ID starting with 2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde not found: ID does not exist" 
containerID="2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.344611 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde"} err="failed to get container status \"2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde\": rpc error: code = NotFound desc = could not find container \"2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde\": container with ID starting with 2ba943381a5e11302635d211cf0600c94162fbd3abf92948e4a2fb1e82ec5bde not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.344682 4831 scope.go:117] "RemoveContainer" containerID="6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.345253 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f\": container with ID starting with 6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f not found: ID does not exist" containerID="6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.345300 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f"} err="failed to get container status \"6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f\": rpc error: code = NotFound desc = could not find container \"6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f\": container with ID starting with 6a0f59a871e573533ec16955bf826ca1763261b3d21e5512bb24a0e62bd0b73f not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.345343 4831 scope.go:117] 
"RemoveContainer" containerID="a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.345744 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4\": container with ID starting with a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4 not found: ID does not exist" containerID="a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.345776 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4"} err="failed to get container status \"a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4\": rpc error: code = NotFound desc = could not find container \"a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4\": container with ID starting with a8bb5d58227d4221c9bfbfe3c07b51f7961676fc4bef20fa5826956eaf08b8f4 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.345799 4831 scope.go:117] "RemoveContainer" containerID="56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.346437 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83\": container with ID starting with 56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83 not found: ID does not exist" containerID="56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.346462 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83"} err="failed to get container status \"56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83\": rpc error: code = NotFound desc = could not find container \"56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83\": container with ID starting with 56d4eeeadba1da79db3faf5f409f1a7b9ecb1b330a57a0eb2389a1bfa3a52c83 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.346476 4831 scope.go:117] "RemoveContainer" containerID="8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.346771 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8\": container with ID starting with 8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8 not found: ID does not exist" containerID="8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.346790 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8"} err="failed to get container status \"8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8\": rpc error: code = NotFound desc = could not find container \"8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8\": container with ID starting with 8dafbfe58e259c950e7b63464cdbfb891e68abc0a86b7d4e0c9bc5406283a1a8 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.346803 4831 scope.go:117] "RemoveContainer" containerID="3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.348360 4831 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0\": container with ID starting with 3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0 not found: ID does not exist" containerID="3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.348445 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0"} err="failed to get container status \"3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0\": rpc error: code = NotFound desc = could not find container \"3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0\": container with ID starting with 3c231ebfbf6bf636033051e5a33d02c71564164db1667cbd16b072382a0b97b0 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.348520 4831 scope.go:117] "RemoveContainer" containerID="f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.348834 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5\": container with ID starting with f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5 not found: ID does not exist" containerID="f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.348862 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5"} err="failed to get container status \"f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5\": rpc error: code = NotFound desc = could not find container 
\"f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5\": container with ID starting with f22d5226986449b8e588d26eb4a1afaecba31a4dd8aa09a17eaa0c7ac5bec2f5 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.348882 4831 scope.go:117] "RemoveContainer" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.353426 4831 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.370171 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-storage-0"] Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.377550 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-storage-0"] Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.389470 4831 scope.go:117] "RemoveContainer" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.405662 4831 scope.go:117] "RemoveContainer" containerID="a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.429944 4831 scope.go:117] "RemoveContainer" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.433187 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74\": container with ID starting with e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74 not found: ID does not exist" containerID="e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.433270 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74"} err="failed to get container status \"e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74\": rpc error: code = NotFound desc = could not find container \"e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74\": container with ID starting with e05a575cd09e198974168eabf30f48f2358a5e36a219b85dec3096cda320fa74 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.433362 4831 scope.go:117] "RemoveContainer" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.433838 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4\": container with ID starting with 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 not found: ID does not exist" containerID="63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.433894 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4"} err="failed to get container status \"63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4\": rpc error: code = NotFound desc = could not find container \"63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4\": container with ID starting with 63c6630f04f34d3b3e8caf36d2674a10b9828a2b65adff4af1822fd99caa8ee4 not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.433934 4831 scope.go:117] "RemoveContainer" containerID="a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 
06:56:00.434296 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae\": container with ID starting with a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae not found: ID does not exist" containerID="a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae" Dec 03 06:56:00 crc kubenswrapper[4831]: I1203 06:56:00.434424 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae"} err="failed to get container status \"a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae\": rpc error: code = NotFound desc = could not find container \"a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae\": container with ID starting with a8a83a53ad6bd26f5c6124009f863fc2f2097fa4dfc3c13079ffbc2ca77116ae not found: ID does not exist" Dec 03 06:56:00 crc kubenswrapper[4831]: E1203 06:56:00.510497 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3e5a86_a8e3_416e_adb8_8b92c3cc81e5.slice/crio-8c5c1ed90709483c852f361b002cbf1be4c46fa6a876d4a80cc2c291e47730c3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7f3e5a86_a8e3_416e_adb8_8b92c3cc81e5.slice\": RecentStats: unable to find data in memory cache]" Dec 03 06:56:01 crc kubenswrapper[4831]: I1203 06:56:01.032626 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" path="/var/lib/kubelet/pods/7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5/volumes" Dec 03 06:56:01 crc kubenswrapper[4831]: I1203 06:56:01.040179 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7f657b4b-bed8-4244-8727-2a3c59364041" path="/var/lib/kubelet/pods/7f657b4b-bed8-4244-8727-2a3c59364041/volumes" Dec 03 06:56:01 crc kubenswrapper[4831]: I1203 06:56:01.325218 4831 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","podd0fdc967-7fb5-4702-b184-6953e8aefd19"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort podd0fdc967-7fb5-4702-b184-6953e8aefd19] : Timed out while waiting for systemd to remove kubepods-besteffort-podd0fdc967_7fb5_4702_b184_6953e8aefd19.slice" Dec 03 06:56:01 crc kubenswrapper[4831]: E1203 06:56:01.325277 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort podd0fdc967-7fb5-4702-b184-6953e8aefd19] : unable to destroy cgroup paths for cgroup [kubepods besteffort podd0fdc967-7fb5-4702-b184-6953e8aefd19] : Timed out while waiting for systemd to remove kubepods-besteffort-podd0fdc967_7fb5_4702_b184_6953e8aefd19.slice" pod="openstack/ovsdbserver-nb-0" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" Dec 03 06:56:01 crc kubenswrapper[4831]: I1203 06:56:01.397589 4831 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3b5c67f9-4d9b-428a-a974-9162d81b1f02"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3b5c67f9-4d9b-428a-a974-9162d81b1f02] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3b5c67f9_4d9b_428a_a974_9162d81b1f02.slice" Dec 03 06:56:01 crc kubenswrapper[4831]: E1203 06:56:01.397694 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3b5c67f9-4d9b-428a-a974-9162d81b1f02] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod3b5c67f9-4d9b-428a-a974-9162d81b1f02] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3b5c67f9_4d9b_428a_a974_9162d81b1f02.slice" 
pod="openstack/ovn-controller-metrics-8s4vh" podUID="3b5c67f9-4d9b-428a-a974-9162d81b1f02" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.063652 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-8s4vh" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.063733 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.120645 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.129937 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.135503 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-metrics-8s4vh"] Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.143607 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-metrics-8s4vh"] Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.321080 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": EOF" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.321476 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-869cfdc5c4-7898s" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" probeResult="failure" output="Get \"https://10.217.0.209:9311/healthcheck\": EOF" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.782958 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.790647 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.896228 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqpdf\" (UniqueName: \"kubernetes.io/projected/e0b16411-0a37-4432-965b-746c2d70d00b-kube-api-access-kqpdf\") pod \"e0b16411-0a37-4432-965b-746c2d70d00b\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897004 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data\") pod \"e0b16411-0a37-4432-965b-746c2d70d00b\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0b16411-0a37-4432-965b-746c2d70d00b-logs\") pod \"e0b16411-0a37-4432-965b-746c2d70d00b\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897069 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46af7209-8790-44ab-b255-8c84c3f5255a-logs\") pod \"46af7209-8790-44ab-b255-8c84c3f5255a\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897094 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-internal-tls-certs\") pod \"e0b16411-0a37-4432-965b-746c2d70d00b\" (UID: 
\"e0b16411-0a37-4432-965b-746c2d70d00b\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897133 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-public-tls-certs\") pod \"e0b16411-0a37-4432-965b-746c2d70d00b\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897157 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data\") pod \"46af7209-8790-44ab-b255-8c84c3f5255a\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897179 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d95vz\" (UniqueName: \"kubernetes.io/projected/46af7209-8790-44ab-b255-8c84c3f5255a-kube-api-access-d95vz\") pod \"46af7209-8790-44ab-b255-8c84c3f5255a\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897213 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-combined-ca-bundle\") pod \"e0b16411-0a37-4432-965b-746c2d70d00b\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-combined-ca-bundle\") pod \"46af7209-8790-44ab-b255-8c84c3f5255a\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897281 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data-custom\") pod \"e0b16411-0a37-4432-965b-746c2d70d00b\" (UID: \"e0b16411-0a37-4432-965b-746c2d70d00b\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.897381 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data-custom\") pod \"46af7209-8790-44ab-b255-8c84c3f5255a\" (UID: \"46af7209-8790-44ab-b255-8c84c3f5255a\") " Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.898298 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b16411-0a37-4432-965b-746c2d70d00b-logs" (OuterVolumeSpecName: "logs") pod "e0b16411-0a37-4432-965b-746c2d70d00b" (UID: "e0b16411-0a37-4432-965b-746c2d70d00b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.898717 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46af7209-8790-44ab-b255-8c84c3f5255a-logs" (OuterVolumeSpecName: "logs") pod "46af7209-8790-44ab-b255-8c84c3f5255a" (UID: "46af7209-8790-44ab-b255-8c84c3f5255a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.902740 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46af7209-8790-44ab-b255-8c84c3f5255a-kube-api-access-d95vz" (OuterVolumeSpecName: "kube-api-access-d95vz") pod "46af7209-8790-44ab-b255-8c84c3f5255a" (UID: "46af7209-8790-44ab-b255-8c84c3f5255a"). InnerVolumeSpecName "kube-api-access-d95vz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.902922 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e0b16411-0a37-4432-965b-746c2d70d00b" (UID: "e0b16411-0a37-4432-965b-746c2d70d00b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.903066 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "46af7209-8790-44ab-b255-8c84c3f5255a" (UID: "46af7209-8790-44ab-b255-8c84c3f5255a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.904070 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b16411-0a37-4432-965b-746c2d70d00b-kube-api-access-kqpdf" (OuterVolumeSpecName: "kube-api-access-kqpdf") pod "e0b16411-0a37-4432-965b-746c2d70d00b" (UID: "e0b16411-0a37-4432-965b-746c2d70d00b"). InnerVolumeSpecName "kube-api-access-kqpdf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.929139 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e0b16411-0a37-4432-965b-746c2d70d00b" (UID: "e0b16411-0a37-4432-965b-746c2d70d00b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.930825 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "46af7209-8790-44ab-b255-8c84c3f5255a" (UID: "46af7209-8790-44ab-b255-8c84c3f5255a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.940950 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "e0b16411-0a37-4432-965b-746c2d70d00b" (UID: "e0b16411-0a37-4432-965b-746c2d70d00b"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.941845 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data" (OuterVolumeSpecName: "config-data") pod "46af7209-8790-44ab-b255-8c84c3f5255a" (UID: "46af7209-8790-44ab-b255-8c84c3f5255a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.942637 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data" (OuterVolumeSpecName: "config-data") pod "e0b16411-0a37-4432-965b-746c2d70d00b" (UID: "e0b16411-0a37-4432-965b-746c2d70d00b"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.950615 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "e0b16411-0a37-4432-965b-746c2d70d00b" (UID: "e0b16411-0a37-4432-965b-746c2d70d00b"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.998839 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.998892 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.998911 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.998930 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqpdf\" (UniqueName: \"kubernetes.io/projected/e0b16411-0a37-4432-965b-746c2d70d00b-kube-api-access-kqpdf\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.998951 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.998968 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/e0b16411-0a37-4432-965b-746c2d70d00b-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.998984 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/46af7209-8790-44ab-b255-8c84c3f5255a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.999000 4831 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.999016 4831 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-public-tls-certs\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.999032 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/46af7209-8790-44ab-b255-8c84c3f5255a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.999049 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d95vz\" (UniqueName: \"kubernetes.io/projected/46af7209-8790-44ab-b255-8c84c3f5255a-kube-api-access-d95vz\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:02 crc kubenswrapper[4831]: I1203 06:56:02.999066 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0b16411-0a37-4432-965b-746c2d70d00b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.023175 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b5c67f9-4d9b-428a-a974-9162d81b1f02" path="/var/lib/kubelet/pods/3b5c67f9-4d9b-428a-a974-9162d81b1f02/volumes" Dec 03 06:56:03 crc 
kubenswrapper[4831]: I1203 06:56:03.024021 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0fdc967-7fb5-4702-b184-6953e8aefd19" path="/var/lib/kubelet/pods/d0fdc967-7fb5-4702-b184-6953e8aefd19/volumes" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.075255 4831 generic.go:334] "Generic (PLEG): container finished" podID="46af7209-8790-44ab-b255-8c84c3f5255a" containerID="5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e" exitCode=137 Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.075350 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" event={"ID":"46af7209-8790-44ab-b255-8c84c3f5255a","Type":"ContainerDied","Data":"5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e"} Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.075449 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.076305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-5d79b99f8c-4tl6f" event={"ID":"46af7209-8790-44ab-b255-8c84c3f5255a","Type":"ContainerDied","Data":"31812e5f50a2f4d7b7c81b6ee905feac628f4bd890a170bb5b7ecbb2605a2e1d"} Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.076422 4831 scope.go:117] "RemoveContainer" containerID="5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.078319 4831 generic.go:334] "Generic (PLEG): container finished" podID="e0b16411-0a37-4432-965b-746c2d70d00b" containerID="ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400" exitCode=137 Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.078359 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-869cfdc5c4-7898s" 
event={"ID":"e0b16411-0a37-4432-965b-746c2d70d00b","Type":"ContainerDied","Data":"ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400"} Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.078375 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-869cfdc5c4-7898s" event={"ID":"e0b16411-0a37-4432-965b-746c2d70d00b","Type":"ContainerDied","Data":"687ba234ddf57a90f8ad1f9e42d53b83f605abffd086bec5e4128003fa2936b1"} Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.078444 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-869cfdc5c4-7898s" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.113912 4831 scope.go:117] "RemoveContainer" containerID="805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.122685 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-869cfdc5c4-7898s"] Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.132658 4831 scope.go:117] "RemoveContainer" containerID="5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e" Dec 03 06:56:03 crc kubenswrapper[4831]: E1203 06:56:03.133398 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e\": container with ID starting with 5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e not found: ID does not exist" containerID="5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.133442 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e"} err="failed to get container status \"5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e\": rpc error: code = 
NotFound desc = could not find container \"5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e\": container with ID starting with 5d6d8b6f2a4e23898996c34bfb0c94ef64025a7f4f9fdc4da668685ab790291e not found: ID does not exist" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.133472 4831 scope.go:117] "RemoveContainer" containerID="805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.133539 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-869cfdc5c4-7898s"] Dec 03 06:56:03 crc kubenswrapper[4831]: E1203 06:56:03.133833 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7\": container with ID starting with 805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7 not found: ID does not exist" containerID="805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.133856 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7"} err="failed to get container status \"805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7\": rpc error: code = NotFound desc = could not find container \"805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7\": container with ID starting with 805905b9e770e618ea67bbd05b6bb50de90789f9322e1d4a3c6163352843ffa7 not found: ID does not exist" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.133871 4831 scope.go:117] "RemoveContainer" containerID="ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.146458 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-5d79b99f8c-4tl6f"] Dec 03 
06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.151680 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-5d79b99f8c-4tl6f"] Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.155730 4831 scope.go:117] "RemoveContainer" containerID="02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.172697 4831 scope.go:117] "RemoveContainer" containerID="ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400" Dec 03 06:56:03 crc kubenswrapper[4831]: E1203 06:56:03.173117 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400\": container with ID starting with ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400 not found: ID does not exist" containerID="ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.173146 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400"} err="failed to get container status \"ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400\": rpc error: code = NotFound desc = could not find container \"ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400\": container with ID starting with ff8dfb17dc67aace688571e91e7aee7d3a0f30d38866040c0fc7042820c8a400 not found: ID does not exist" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.173166 4831 scope.go:117] "RemoveContainer" containerID="02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c" Dec 03 06:56:03 crc kubenswrapper[4831]: E1203 06:56:03.173504 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c\": container with ID starting with 02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c not found: ID does not exist" containerID="02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c" Dec 03 06:56:03 crc kubenswrapper[4831]: I1203 06:56:03.173546 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c"} err="failed to get container status \"02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c\": rpc error: code = NotFound desc = could not find container \"02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c\": container with ID starting with 02504848c3b09a3613ecf77f3dec52f7b03c139c38e10d22eeb182e3c700886c not found: ID does not exist" Dec 03 06:56:04 crc kubenswrapper[4831]: I1203 06:56:04.694680 4831 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod3a92648c-9b8b-4fcf-b028-3f59ce2ebf27"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod3a92648c-9b8b-4fcf-b028-3f59ce2ebf27] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3a92648c_9b8b_4fcf_b028_3f59ce2ebf27.slice" Dec 03 06:56:04 crc kubenswrapper[4831]: E1203 06:56:04.694762 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod3a92648c-9b8b-4fcf-b028-3f59ce2ebf27] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod3a92648c-9b8b-4fcf-b028-3f59ce2ebf27] : Timed out while waiting for systemd to remove kubepods-besteffort-pod3a92648c_9b8b_4fcf_b028_3f59ce2ebf27.slice" pod="openstack/barbican-worker-55749f9879-hprsg" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" Dec 03 06:56:05 crc kubenswrapper[4831]: I1203 06:56:05.029969 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="46af7209-8790-44ab-b255-8c84c3f5255a" path="/var/lib/kubelet/pods/46af7209-8790-44ab-b255-8c84c3f5255a/volumes" Dec 03 06:56:05 crc kubenswrapper[4831]: I1203 06:56:05.031629 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" path="/var/lib/kubelet/pods/e0b16411-0a37-4432-965b-746c2d70d00b/volumes" Dec 03 06:56:05 crc kubenswrapper[4831]: I1203 06:56:05.106077 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-55749f9879-hprsg" Dec 03 06:56:05 crc kubenswrapper[4831]: I1203 06:56:05.136678 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-worker-55749f9879-hprsg"] Dec 03 06:56:05 crc kubenswrapper[4831]: I1203 06:56:05.144607 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-worker-55749f9879-hprsg"] Dec 03 06:56:05 crc kubenswrapper[4831]: I1203 06:56:05.181942 4831 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","besteffort","pod5bf96e96-13ba-44c9-b16e-b1c2acbfc643"] err="unable to destroy cgroup paths for cgroup [kubepods besteffort pod5bf96e96-13ba-44c9-b16e-b1c2acbfc643] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5bf96e96_13ba_44c9_b16e_b1c2acbfc643.slice" Dec 03 06:56:05 crc kubenswrapper[4831]: E1203 06:56:05.181998 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods besteffort pod5bf96e96-13ba-44c9-b16e-b1c2acbfc643] : unable to destroy cgroup paths for cgroup [kubepods besteffort pod5bf96e96-13ba-44c9-b16e-b1c2acbfc643] : Timed out while waiting for systemd to remove kubepods-besteffort-pod5bf96e96_13ba_44c9_b16e_b1c2acbfc643.slice" pod="openstack/kube-state-metrics-0" podUID="5bf96e96-13ba-44c9-b16e-b1c2acbfc643" Dec 03 06:56:06 crc kubenswrapper[4831]: I1203 06:56:06.113147 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 06:56:06 crc kubenswrapper[4831]: I1203 06:56:06.132975 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:56:06 crc kubenswrapper[4831]: I1203 06:56:06.144117 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.015404 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.023726 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a92648c-9b8b-4fcf-b028-3f59ce2ebf27" path="/var/lib/kubelet/pods/3a92648c-9b8b-4fcf-b028-3f59ce2ebf27/volumes" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.024285 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bf96e96-13ba-44c9-b16e-b1c2acbfc643" path="/var/lib/kubelet/pods/5bf96e96-13ba-44c9-b16e-b1c2acbfc643/volumes" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.125403 4831 generic.go:334] "Generic (PLEG): container finished" podID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerID="09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee" exitCode=137 Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.125450 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" event={"ID":"4b96e376-8104-4a92-b1ec-6078943d0b50","Type":"ContainerDied","Data":"09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee"} Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.125476 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" event={"ID":"4b96e376-8104-4a92-b1ec-6078943d0b50","Type":"ContainerDied","Data":"9ccfc29d10c7135ff0ec34002d80d413f8e103fa94d72ba9d8371cb4bfe9fee0"} Dec 03 
06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.125509 4831 scope.go:117] "RemoveContainer" containerID="09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.125620 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-768d8958fd-hbthr" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.147412 4831 scope.go:117] "RemoveContainer" containerID="0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.167182 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data\") pod \"4b96e376-8104-4a92-b1ec-6078943d0b50\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.167234 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data-custom\") pod \"4b96e376-8104-4a92-b1ec-6078943d0b50\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.167273 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-combined-ca-bundle\") pod \"4b96e376-8104-4a92-b1ec-6078943d0b50\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.167341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b96e376-8104-4a92-b1ec-6078943d0b50-logs\") pod \"4b96e376-8104-4a92-b1ec-6078943d0b50\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 
06:56:07.167400 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6mx4\" (UniqueName: \"kubernetes.io/projected/4b96e376-8104-4a92-b1ec-6078943d0b50-kube-api-access-x6mx4\") pod \"4b96e376-8104-4a92-b1ec-6078943d0b50\" (UID: \"4b96e376-8104-4a92-b1ec-6078943d0b50\") " Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.168140 4831 scope.go:117] "RemoveContainer" containerID="09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.168434 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b96e376-8104-4a92-b1ec-6078943d0b50-logs" (OuterVolumeSpecName: "logs") pod "4b96e376-8104-4a92-b1ec-6078943d0b50" (UID: "4b96e376-8104-4a92-b1ec-6078943d0b50"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:07 crc kubenswrapper[4831]: E1203 06:56:07.168567 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee\": container with ID starting with 09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee not found: ID does not exist" containerID="09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.168620 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee"} err="failed to get container status \"09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee\": rpc error: code = NotFound desc = could not find container \"09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee\": container with ID starting with 09ee85d7ac2cb3cc7d8f856bddb0cb3f115aeb97e79547bb4adf57568c348fee not found: ID does not exist" Dec 03 06:56:07 crc 
kubenswrapper[4831]: I1203 06:56:07.168655 4831 scope.go:117] "RemoveContainer" containerID="0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca" Dec 03 06:56:07 crc kubenswrapper[4831]: E1203 06:56:07.169143 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca\": container with ID starting with 0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca not found: ID does not exist" containerID="0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.169176 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca"} err="failed to get container status \"0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca\": rpc error: code = NotFound desc = could not find container \"0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca\": container with ID starting with 0c8277d69038c0d9447efe65fefda45d9ad5b34d1becc16314549119edd58eca not found: ID does not exist" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.172487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b96e376-8104-4a92-b1ec-6078943d0b50-kube-api-access-x6mx4" (OuterVolumeSpecName: "kube-api-access-x6mx4") pod "4b96e376-8104-4a92-b1ec-6078943d0b50" (UID: "4b96e376-8104-4a92-b1ec-6078943d0b50"). InnerVolumeSpecName "kube-api-access-x6mx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.185389 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4b96e376-8104-4a92-b1ec-6078943d0b50" (UID: "4b96e376-8104-4a92-b1ec-6078943d0b50"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.196433 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b96e376-8104-4a92-b1ec-6078943d0b50" (UID: "4b96e376-8104-4a92-b1ec-6078943d0b50"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.204673 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data" (OuterVolumeSpecName: "config-data") pod "4b96e376-8104-4a92-b1ec-6078943d0b50" (UID: "4b96e376-8104-4a92-b1ec-6078943d0b50"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.268812 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6mx4\" (UniqueName: \"kubernetes.io/projected/4b96e376-8104-4a92-b1ec-6078943d0b50-kube-api-access-x6mx4\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.269128 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.269206 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.269275 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b96e376-8104-4a92-b1ec-6078943d0b50-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.269364 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b96e376-8104-4a92-b1ec-6078943d0b50-logs\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.465635 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-keystone-listener-768d8958fd-hbthr"] Dec 03 06:56:07 crc kubenswrapper[4831]: I1203 06:56:07.475132 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-keystone-listener-768d8958fd-hbthr"] Dec 03 06:56:09 crc kubenswrapper[4831]: I1203 06:56:09.024074 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" path="/var/lib/kubelet/pods/4b96e376-8104-4a92-b1ec-6078943d0b50/volumes" Dec 
03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.348594 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bw746"] Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349413 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349430 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349450 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server-init" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349458 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server-init" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349470 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349477 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349489 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81084d6a-9987-4466-8f89-455aa3ff2627" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349496 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="81084d6a-9987-4466-8f89-455aa3ff2627" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349513 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-api" Dec 03 06:56:17 crc 
kubenswrapper[4831]: I1203 06:56:17.349521 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-api" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349533 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349540 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349566 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-updater" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349573 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-updater" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349584 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349592 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349605 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerName="placement-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349612 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerName="placement-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349625 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="ceilometer-central-agent" Dec 03 06:56:17 crc 
kubenswrapper[4831]: I1203 06:56:17.349631 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="ceilometer-central-agent" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349639 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-auditor" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349644 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-auditor" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349650 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerName="placement-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349656 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerName="placement-api" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349670 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerName="glance-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349677 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerName="glance-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349688 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerName="rabbitmq" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349695 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerName="rabbitmq" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349706 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 
06:56:17.349712 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349730 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerName="setup-container" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349736 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerName="setup-container" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349743 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-server" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349748 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-server" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349757 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-updater" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349762 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-updater" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349773 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5067e964-1daa-4bbd-8e2b-872ce1067389" containerName="keystone-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349779 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5067e964-1daa-4bbd-8e2b-872ce1067389" containerName="keystone-api" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349785 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349790 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349800 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349805 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349816 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerName="glance-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349821 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerName="glance-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349829 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-metadata" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349834 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-metadata" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349841 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="proxy-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349846 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="proxy-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349855 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-server" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349861 4831 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-server" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349868 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-auditor" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349873 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-auditor" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349881 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349889 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349900 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-reaper" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349905 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-reaper" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349914 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="sg-core" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349920 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="sg-core" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349929 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349935 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-api" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349943 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349949 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349959 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349964 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349972 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bf96e96-13ba-44c9-b16e-b1c2acbfc643" containerName="kube-state-metrics" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349979 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bf96e96-13ba-44c9-b16e-b1c2acbfc643" containerName="kube-state-metrics" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.349989 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-expirer" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.349994 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-expirer" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350002 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20722e3f-f810-4ac7-80d7-09cae400150a" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350007 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="20722e3f-f810-4ac7-80d7-09cae400150a" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350025 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a18ae10-7f43-4072-b01c-1564735985be" containerName="nova-cell0-conductor-conductor" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350031 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a18ae10-7f43-4072-b01c-1564735985be" containerName="nova-cell0-conductor-conductor" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350039 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" containerName="mysql-bootstrap" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350046 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" containerName="mysql-bootstrap" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350057 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerName="setup-container" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350064 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerName="setup-container" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350074 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-auditor" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350080 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-auditor" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350089 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46af7209-8790-44ab-b255-8c84c3f5255a" containerName="barbican-worker-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350097 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="46af7209-8790-44ab-b255-8c84c3f5255a" containerName="barbican-worker-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350106 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-server" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350112 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-server" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350119 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerName="glance-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350125 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerName="glance-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350135 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f54df74-81ac-43f7-9075-51cb26200c4e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350141 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f54df74-81ac-43f7-9075-51cb26200c4e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350151 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerName="barbican-keystone-listener-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350157 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerName="barbican-keystone-listener-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350168 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c60bce87-ea0b-4b3d-8243-93ed40c232ff" containerName="nova-cell1-conductor-conductor" Dec 03 06:56:17 crc 
kubenswrapper[4831]: I1203 06:56:17.350174 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c60bce87-ea0b-4b3d-8243-93ed40c232ff" containerName="nova-cell1-conductor-conductor" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350184 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="ceilometer-notification-agent" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350189 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="ceilometer-notification-agent" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350198 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aadb65a0-295d-4fcf-b148-44480346d357" containerName="nova-scheduler-scheduler" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350204 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="aadb65a0-295d-4fcf-b148-44480346d357" containerName="nova-scheduler-scheduler" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350210 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f54df74-81ac-43f7-9075-51cb26200c4e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350216 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f54df74-81ac-43f7-9075-51cb26200c4e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350222 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54eb5ffe-7e2f-4a33-9689-0470affe10e0" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350228 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="54eb5ffe-7e2f-4a33-9689-0470affe10e0" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350236 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerName="rabbitmq" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350242 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerName="rabbitmq" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350249 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56055ee5-407e-4ced-865f-03585e5f7f7b" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350255 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="56055ee5-407e-4ced-865f-03585e5f7f7b" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350262 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa796212-03db-4860-93f4-d2918ed44070" containerName="cinder-scheduler" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350267 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa796212-03db-4860-93f4-d2918ed44070" containerName="cinder-scheduler" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350277 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e69614e3-a574-42e1-adc2-09861e9974e5" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350283 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e69614e3-a574-42e1-adc2-09861e9974e5" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350290 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350295 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350302 4831 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350308 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350335 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="openstack-network-exporter" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350342 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="openstack-network-exporter" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350350 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350355 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350363 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1ffe861-7d12-49e2-9737-fc100833da39" containerName="memcached" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350368 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ffe861-7d12-49e2-9737-fc100833da39" containerName="memcached" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350377 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" containerName="galera" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350383 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" containerName="galera" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350390 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="46af7209-8790-44ab-b255-8c84c3f5255a" containerName="barbican-worker" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350396 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="46af7209-8790-44ab-b255-8c84c3f5255a" containerName="barbican-worker" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350405 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350410 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350421 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ef057cb-0d02-49d4-a20d-ac9a3f85484e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350426 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ef057cb-0d02-49d4-a20d-ac9a3f85484e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350435 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="rsync" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350440 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="rsync" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350451 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="ovn-northd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350456 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="ovn-northd" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350464 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerName="barbican-keystone-listener" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350469 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerName="barbican-keystone-listener" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350479 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350484 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350490 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa796212-03db-4860-93f4-d2918ed44070" containerName="probe" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350497 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa796212-03db-4860-93f4-d2918ed44070" containerName="probe" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350506 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerName="glance-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350512 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerName="glance-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: E1203 06:56:17.350518 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="swift-recon-cron" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350525 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="swift-recon-cron" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350650 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="20722e3f-f810-4ac7-80d7-09cae400150a" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350674 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-server" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350684 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="rsync" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350691 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350697 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350703 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f54df74-81ac-43f7-9075-51cb26200c4e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350710 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c60bce87-ea0b-4b3d-8243-93ed40c232ff" containerName="nova-cell1-conductor-conductor" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350718 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-metadata" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350727 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="ovn-northd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350734 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="81084d6a-9987-4466-8f89-455aa3ff2627" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350745 
4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovsdb-server" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350751 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="ceilometer-notification-agent" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350757 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1ffe861-7d12-49e2-9737-fc100833da39" containerName="memcached" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350763 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350770 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerName="barbican-keystone-listener-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350778 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="54eb5ffe-7e2f-4a33-9689-0470affe10e0" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350786 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350796 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b548290-abc5-4c67-862c-16aa03a652da" containerName="openstack-network-exporter" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350805 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350822 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-auditor" 
Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350829 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d6ac806-4ac5-4de4-b6a0-b265032150f4" containerName="rabbitmq" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350840 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa796212-03db-4860-93f4-d2918ed44070" containerName="cinder-scheduler" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350852 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-updater" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350860 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="56055ee5-407e-4ced-865f-03585e5f7f7b" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350874 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5fe1a689-1241-4c11-93ca-875e53319668" containerName="ovn-controller" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350883 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350891 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="ceilometer-central-agent" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350899 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerName="placement-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350908 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a18ae10-7f43-4072-b01c-1564735985be" containerName="nova-cell0-conductor-conductor" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350916 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4b96e376-8104-4a92-b1ec-6078943d0b50" containerName="barbican-keystone-listener" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350922 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-auditor" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350930 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerName="glance-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350937 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a5bbc77d-ca7b-48ab-b8bc-b304a12bb586" containerName="nova-metadata-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350947 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="swift-recon-cron" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350954 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b16411-0a37-4432-965b-746c2d70d00b" containerName="barbican-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350962 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-server" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350970 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-server" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350978 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-reaper" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350986 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a438bff-fbe4-4ae4-8d0f-3eecc1819f50" containerName="nova-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350994 4831 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="1eab7a3f-11f0-4d00-b436-93cc30c2e8e1" containerName="cinder-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.350999 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="267687cf-58da-42e0-852e-c8c87f2ea42a" containerName="glance-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351005 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="46af7209-8790-44ab-b255-8c84c3f5255a" containerName="barbican-worker" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351012 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f442a70-f040-4b0e-853d-6ce1f4caf63d" containerName="neutron-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351018 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-updater" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351025 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5067e964-1daa-4bbd-8e2b-872ce1067389" containerName="keystone-api" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351032 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="55c21e0b-d5ab-4a59-9b0b-c1d8bbe6a01b" containerName="galera" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351041 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="sg-core" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351048 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="container-auditor" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351055 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerName="glance-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351061 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5bf96e96-13ba-44c9-b16e-b1c2acbfc643" containerName="kube-state-metrics" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351069 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="object-expirer" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351078 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351084 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="46af7209-8790-44ab-b255-8c84c3f5255a" containerName="barbican-worker-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351090 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ef057cb-0d02-49d4-a20d-ac9a3f85484e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351097 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f3e5a86-a8e3-416e-adb8-8b92c3cc81e5" containerName="account-replicator" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351104 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e69614e3-a574-42e1-adc2-09861e9974e5" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351121 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a3b2cb-5698-41ab-939f-a7f1e3ccc998" containerName="proxy-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351130 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc0cbb94-92ec-4369-b609-f3186f302c66" containerName="rabbitmq" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351138 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="770b98aa-f177-4c7b-b37e-1664c039f47d" containerName="barbican-api" Dec 03 06:56:17 crc 
kubenswrapper[4831]: I1203 06:56:17.351146 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa796212-03db-4860-93f4-d2918ed44070" containerName="probe" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351153 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f657b4b-bed8-4244-8727-2a3c59364041" containerName="ovs-vswitchd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351159 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a5d88e3-73a3-4f3d-af31-af675ab452bd" containerName="glance-httpd" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351167 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cccc3a0b-98c7-4930-a7b5-3c1320a5ee69" containerName="placement-log" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351172 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="aadb65a0-295d-4fcf-b148-44480346d357" containerName="nova-scheduler-scheduler" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.351478 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f54df74-81ac-43f7-9075-51cb26200c4e" containerName="mariadb-account-delete" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.352178 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.365763 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bw746"] Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.423538 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-catalog-content\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.423608 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh5qz\" (UniqueName: \"kubernetes.io/projected/fc28422b-c839-4b0c-93d2-9506dae67701-kube-api-access-xh5qz\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.423761 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-utilities\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.524986 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-utilities\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.525067 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-catalog-content\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.525126 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh5qz\" (UniqueName: \"kubernetes.io/projected/fc28422b-c839-4b0c-93d2-9506dae67701-kube-api-access-xh5qz\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.525648 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-utilities\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.525657 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-catalog-content\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.548266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh5qz\" (UniqueName: \"kubernetes.io/projected/fc28422b-c839-4b0c-93d2-9506dae67701-kube-api-access-xh5qz\") pod \"community-operators-bw746\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:17 crc kubenswrapper[4831]: I1203 06:56:17.669144 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:18 crc kubenswrapper[4831]: I1203 06:56:18.029075 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bw746"] Dec 03 06:56:18 crc kubenswrapper[4831]: I1203 06:56:18.265536 4831 generic.go:334] "Generic (PLEG): container finished" podID="fc28422b-c839-4b0c-93d2-9506dae67701" containerID="b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9" exitCode=0 Dec 03 06:56:18 crc kubenswrapper[4831]: I1203 06:56:18.265576 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw746" event={"ID":"fc28422b-c839-4b0c-93d2-9506dae67701","Type":"ContainerDied","Data":"b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9"} Dec 03 06:56:18 crc kubenswrapper[4831]: I1203 06:56:18.265602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw746" event={"ID":"fc28422b-c839-4b0c-93d2-9506dae67701","Type":"ContainerStarted","Data":"bb891758caaff9ebd80b92b550b2f7b8d6b62328987fcd44e7c99f0f7d5faaa5"} Dec 03 06:56:20 crc kubenswrapper[4831]: I1203 06:56:20.285401 4831 generic.go:334] "Generic (PLEG): container finished" podID="fc28422b-c839-4b0c-93d2-9506dae67701" containerID="f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7" exitCode=0 Dec 03 06:56:20 crc kubenswrapper[4831]: I1203 06:56:20.285459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw746" event={"ID":"fc28422b-c839-4b0c-93d2-9506dae67701","Type":"ContainerDied","Data":"f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7"} Dec 03 06:56:21 crc kubenswrapper[4831]: I1203 06:56:21.299207 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw746" 
event={"ID":"fc28422b-c839-4b0c-93d2-9506dae67701","Type":"ContainerStarted","Data":"e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99"} Dec 03 06:56:21 crc kubenswrapper[4831]: I1203 06:56:21.322567 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bw746" podStartSLOduration=1.872078148 podStartE2EDuration="4.322541716s" podCreationTimestamp="2025-12-03 06:56:17 +0000 UTC" firstStartedPulling="2025-12-03 06:56:18.267413381 +0000 UTC m=+1515.610996899" lastFinishedPulling="2025-12-03 06:56:20.717876949 +0000 UTC m=+1518.061460467" observedRunningTime="2025-12-03 06:56:21.317001253 +0000 UTC m=+1518.660584811" watchObservedRunningTime="2025-12-03 06:56:21.322541716 +0000 UTC m=+1518.666125264" Dec 03 06:56:27 crc kubenswrapper[4831]: I1203 06:56:27.596725 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:56:27 crc kubenswrapper[4831]: I1203 06:56:27.597223 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:56:27 crc kubenswrapper[4831]: I1203 06:56:27.670380 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:27 crc kubenswrapper[4831]: I1203 06:56:27.670443 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:27 crc kubenswrapper[4831]: I1203 06:56:27.729754 4831 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:28 crc kubenswrapper[4831]: I1203 06:56:28.405970 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:28 crc kubenswrapper[4831]: I1203 06:56:28.445138 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bw746"] Dec 03 06:56:30 crc kubenswrapper[4831]: I1203 06:56:30.381777 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bw746" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" containerName="registry-server" containerID="cri-o://e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99" gracePeriod=2 Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.320633 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.396191 4831 generic.go:334] "Generic (PLEG): container finished" podID="fc28422b-c839-4b0c-93d2-9506dae67701" containerID="e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99" exitCode=0 Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.396249 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw746" event={"ID":"fc28422b-c839-4b0c-93d2-9506dae67701","Type":"ContainerDied","Data":"e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99"} Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.396290 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bw746" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.396330 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bw746" event={"ID":"fc28422b-c839-4b0c-93d2-9506dae67701","Type":"ContainerDied","Data":"bb891758caaff9ebd80b92b550b2f7b8d6b62328987fcd44e7c99f0f7d5faaa5"} Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.396355 4831 scope.go:117] "RemoveContainer" containerID="e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.420503 4831 scope.go:117] "RemoveContainer" containerID="f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.436997 4831 scope.go:117] "RemoveContainer" containerID="b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.438555 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh5qz\" (UniqueName: \"kubernetes.io/projected/fc28422b-c839-4b0c-93d2-9506dae67701-kube-api-access-xh5qz\") pod \"fc28422b-c839-4b0c-93d2-9506dae67701\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.438622 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-catalog-content\") pod \"fc28422b-c839-4b0c-93d2-9506dae67701\" (UID: \"fc28422b-c839-4b0c-93d2-9506dae67701\") " Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.439250 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-utilities\") pod \"fc28422b-c839-4b0c-93d2-9506dae67701\" (UID: 
\"fc28422b-c839-4b0c-93d2-9506dae67701\") " Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.440367 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-utilities" (OuterVolumeSpecName: "utilities") pod "fc28422b-c839-4b0c-93d2-9506dae67701" (UID: "fc28422b-c839-4b0c-93d2-9506dae67701"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.443513 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc28422b-c839-4b0c-93d2-9506dae67701-kube-api-access-xh5qz" (OuterVolumeSpecName: "kube-api-access-xh5qz") pod "fc28422b-c839-4b0c-93d2-9506dae67701" (UID: "fc28422b-c839-4b0c-93d2-9506dae67701"). InnerVolumeSpecName "kube-api-access-xh5qz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.492239 4831 scope.go:117] "RemoveContainer" containerID="e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99" Dec 03 06:56:31 crc kubenswrapper[4831]: E1203 06:56:31.492768 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99\": container with ID starting with e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99 not found: ID does not exist" containerID="e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.492804 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99"} err="failed to get container status \"e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99\": rpc error: code = NotFound desc = could not find container 
\"e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99\": container with ID starting with e4c9023e0df62ed331232529219cb0f529700748b93933f8056a076c6196cd99 not found: ID does not exist" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.492828 4831 scope.go:117] "RemoveContainer" containerID="f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7" Dec 03 06:56:31 crc kubenswrapper[4831]: E1203 06:56:31.493153 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7\": container with ID starting with f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7 not found: ID does not exist" containerID="f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.493189 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7"} err="failed to get container status \"f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7\": rpc error: code = NotFound desc = could not find container \"f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7\": container with ID starting with f5a78bba84db8767fe1f989545cc7537c52e45a8868406b7de1252ecf32a99e7 not found: ID does not exist" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.493210 4831 scope.go:117] "RemoveContainer" containerID="b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9" Dec 03 06:56:31 crc kubenswrapper[4831]: E1203 06:56:31.493472 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9\": container with ID starting with b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9 not found: ID does not exist" 
containerID="b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.493502 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9"} err="failed to get container status \"b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9\": rpc error: code = NotFound desc = could not find container \"b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9\": container with ID starting with b522be0b6f4697e2606238d3f22f513d36ccce3dd990a4b93e90ee9069db44a9 not found: ID does not exist" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.493994 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fc28422b-c839-4b0c-93d2-9506dae67701" (UID: "fc28422b-c839-4b0c-93d2-9506dae67701"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.540773 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh5qz\" (UniqueName: \"kubernetes.io/projected/fc28422b-c839-4b0c-93d2-9506dae67701-kube-api-access-xh5qz\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.540807 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.540816 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fc28422b-c839-4b0c-93d2-9506dae67701-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.747129 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bw746"] Dec 03 06:56:31 crc kubenswrapper[4831]: I1203 06:56:31.752429 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bw746"] Dec 03 06:56:33 crc kubenswrapper[4831]: I1203 06:56:33.027767 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" path="/var/lib/kubelet/pods/fc28422b-c839-4b0c-93d2-9506dae67701/volumes" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.491298 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f2n4v"] Dec 03 06:56:57 crc kubenswrapper[4831]: E1203 06:56:57.492180 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" containerName="registry-server" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.492196 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" 
containerName="registry-server" Dec 03 06:56:57 crc kubenswrapper[4831]: E1203 06:56:57.492219 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" containerName="extract-utilities" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.492230 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" containerName="extract-utilities" Dec 03 06:56:57 crc kubenswrapper[4831]: E1203 06:56:57.492260 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" containerName="extract-content" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.492271 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" containerName="extract-content" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.492512 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc28422b-c839-4b0c-93d2-9506dae67701" containerName="registry-server" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.493807 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.503133 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f2n4v"] Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.596883 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.596994 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.597080 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.598291 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.598469 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" 
containerID="cri-o://3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" gracePeriod=600 Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.627051 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-utilities\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.627116 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnkf7\" (UniqueName: \"kubernetes.io/projected/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-kube-api-access-tnkf7\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.627487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-catalog-content\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.728789 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-utilities\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.728846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnkf7\" (UniqueName: 
\"kubernetes.io/projected/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-kube-api-access-tnkf7\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.728884 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-catalog-content\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.729417 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-utilities\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.729533 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-catalog-content\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: E1203 06:56:57.740266 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.752383 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-tnkf7\" (UniqueName: \"kubernetes.io/projected/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-kube-api-access-tnkf7\") pod \"certified-operators-f2n4v\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:57 crc kubenswrapper[4831]: I1203 06:56:57.816817 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:56:58 crc kubenswrapper[4831]: I1203 06:56:58.335169 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f2n4v"] Dec 03 06:56:58 crc kubenswrapper[4831]: I1203 06:56:58.669242 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerID="e14322d4e3b87e93e081e052eaf5a30d671af99571fe49ebd887823079b170b0" exitCode=0 Dec 03 06:56:58 crc kubenswrapper[4831]: I1203 06:56:58.669539 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2n4v" event={"ID":"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c","Type":"ContainerDied","Data":"e14322d4e3b87e93e081e052eaf5a30d671af99571fe49ebd887823079b170b0"} Dec 03 06:56:58 crc kubenswrapper[4831]: I1203 06:56:58.669601 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2n4v" event={"ID":"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c","Type":"ContainerStarted","Data":"b900fbbe691ae1746579e5bff35c36110dc3476fb21472418baa3658ef5000ff"} Dec 03 06:56:58 crc kubenswrapper[4831]: I1203 06:56:58.674622 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" exitCode=0 Dec 03 06:56:58 crc kubenswrapper[4831]: I1203 06:56:58.674671 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d"} Dec 03 06:56:58 crc kubenswrapper[4831]: I1203 06:56:58.674733 4831 scope.go:117] "RemoveContainer" containerID="cdd4c5391a62949652f778a85945bc3bd1190df6d8604a3965df5d75b3dfc56a" Dec 03 06:56:58 crc kubenswrapper[4831]: I1203 06:56:58.675495 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:56:58 crc kubenswrapper[4831]: E1203 06:56:58.675791 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:56:59 crc kubenswrapper[4831]: I1203 06:56:59.685909 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2n4v" event={"ID":"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c","Type":"ContainerStarted","Data":"b5e63408e660556d219617db98cb317869a446e6d653e07924ead34cbafb1f0b"} Dec 03 06:57:00 crc kubenswrapper[4831]: I1203 06:57:00.707544 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerID="b5e63408e660556d219617db98cb317869a446e6d653e07924ead34cbafb1f0b" exitCode=0 Dec 03 06:57:00 crc kubenswrapper[4831]: I1203 06:57:00.707635 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2n4v" event={"ID":"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c","Type":"ContainerDied","Data":"b5e63408e660556d219617db98cb317869a446e6d653e07924ead34cbafb1f0b"} Dec 03 06:57:01 crc kubenswrapper[4831]: I1203 06:57:01.719919 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2n4v" event={"ID":"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c","Type":"ContainerStarted","Data":"161669581ef6046dceffe6fc36a60e7c20d8fc94079eb3e47801a9fd925abbe7"} Dec 03 06:57:01 crc kubenswrapper[4831]: I1203 06:57:01.758900 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f2n4v" podStartSLOduration=2.299191622 podStartE2EDuration="4.758858305s" podCreationTimestamp="2025-12-03 06:56:57 +0000 UTC" firstStartedPulling="2025-12-03 06:56:58.671936872 +0000 UTC m=+1556.015520380" lastFinishedPulling="2025-12-03 06:57:01.131603555 +0000 UTC m=+1558.475187063" observedRunningTime="2025-12-03 06:57:01.750508505 +0000 UTC m=+1559.094092023" watchObservedRunningTime="2025-12-03 06:57:01.758858305 +0000 UTC m=+1559.102441823" Dec 03 06:57:07 crc kubenswrapper[4831]: I1203 06:57:07.817059 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:57:07 crc kubenswrapper[4831]: I1203 06:57:07.817880 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:57:07 crc kubenswrapper[4831]: I1203 06:57:07.994419 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:57:08 crc kubenswrapper[4831]: I1203 06:57:08.885020 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:57:08 crc kubenswrapper[4831]: I1203 06:57:08.949119 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f2n4v"] Dec 03 06:57:10 crc kubenswrapper[4831]: I1203 06:57:10.826406 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f2n4v" 
podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerName="registry-server" containerID="cri-o://161669581ef6046dceffe6fc36a60e7c20d8fc94079eb3e47801a9fd925abbe7" gracePeriod=2 Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.013457 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:57:11 crc kubenswrapper[4831]: E1203 06:57:11.013962 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.840393 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerID="161669581ef6046dceffe6fc36a60e7c20d8fc94079eb3e47801a9fd925abbe7" exitCode=0 Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.840467 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2n4v" event={"ID":"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c","Type":"ContainerDied","Data":"161669581ef6046dceffe6fc36a60e7c20d8fc94079eb3e47801a9fd925abbe7"} Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.841426 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f2n4v" event={"ID":"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c","Type":"ContainerDied","Data":"b900fbbe691ae1746579e5bff35c36110dc3476fb21472418baa3658ef5000ff"} Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.841483 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b900fbbe691ae1746579e5bff35c36110dc3476fb21472418baa3658ef5000ff" Dec 03 06:57:11 crc 
kubenswrapper[4831]: I1203 06:57:11.854183 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.859972 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-utilities\") pod \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.860065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-catalog-content\") pod \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.860140 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnkf7\" (UniqueName: \"kubernetes.io/projected/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-kube-api-access-tnkf7\") pod \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\" (UID: \"b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c\") " Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.860951 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-utilities" (OuterVolumeSpecName: "utilities") pod "b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" (UID: "b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.885244 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-kube-api-access-tnkf7" (OuterVolumeSpecName: "kube-api-access-tnkf7") pod "b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" (UID: "b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c"). InnerVolumeSpecName "kube-api-access-tnkf7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.926790 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" (UID: "b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.962490 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.962526 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 06:57:11 crc kubenswrapper[4831]: I1203 06:57:11.962543 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnkf7\" (UniqueName: \"kubernetes.io/projected/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c-kube-api-access-tnkf7\") on node \"crc\" DevicePath \"\"" Dec 03 06:57:12 crc kubenswrapper[4831]: I1203 06:57:12.853424 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f2n4v" Dec 03 06:57:12 crc kubenswrapper[4831]: I1203 06:57:12.897426 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f2n4v"] Dec 03 06:57:12 crc kubenswrapper[4831]: I1203 06:57:12.907381 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f2n4v"] Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.026830 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" path="/var/lib/kubelet/pods/b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c/volumes" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.661001 4831 scope.go:117] "RemoveContainer" containerID="9e4f7d54ce5529f9e5dcc758e6b2dea870d4d72e906329750dd8f3148bb9a9ae" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.714077 4831 scope.go:117] "RemoveContainer" containerID="8c2dcbad7750849b6be9771a4f94a9823beb233f81ffd0e05f59ae49bdc63cf8" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.736794 4831 scope.go:117] "RemoveContainer" containerID="61288a4ffbb5c63055a0602926b289f0293c643cceda88d7579316fb994d6eed" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.770870 4831 scope.go:117] "RemoveContainer" containerID="2083069d237ad9e634573977e455af0f77cad2d64fa39a6dce0010ff03639849" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.838079 4831 scope.go:117] "RemoveContainer" containerID="454f4b86b6c9c0a48f9173d3808d9aeaba4f55df7d2f0c60530104f9fda643a9" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.892812 4831 scope.go:117] "RemoveContainer" containerID="c817be12198932ba478b957a6adc60b0016409ee731b40995f2037effe75923a" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.918648 4831 scope.go:117] "RemoveContainer" containerID="cfbeb31890e53191672906e82bdc1b8672d17ef21f7bf4d793893c62a489706c" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.945904 4831 
scope.go:117] "RemoveContainer" containerID="055b028fb60e01afe549cd5b7477b7dee313aab541390287578dc34dd742e9d3" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.962033 4831 scope.go:117] "RemoveContainer" containerID="8f3f08db7a40b296fad0ffb99615b25c24f20d9ef4906c8b1c92750fff434683" Dec 03 06:57:13 crc kubenswrapper[4831]: I1203 06:57:13.977370 4831 scope.go:117] "RemoveContainer" containerID="b925f0622d92bf472d21abc6c23f6621534691535cbfaf8ba8388d4d004db117" Dec 03 06:57:14 crc kubenswrapper[4831]: I1203 06:57:14.004986 4831 scope.go:117] "RemoveContainer" containerID="cd5262ff4d09f2a70f87d311ed4d215c4c526cf5ccf08abf8aeb8dd061f54cd4" Dec 03 06:57:14 crc kubenswrapper[4831]: I1203 06:57:14.036726 4831 scope.go:117] "RemoveContainer" containerID="e701230003bdb542945df73fbff521649b1c0dbe31d133988a35ce0bd844cb8e" Dec 03 06:57:14 crc kubenswrapper[4831]: I1203 06:57:14.053820 4831 scope.go:117] "RemoveContainer" containerID="25a311e276ecde83fbfb561d83a456c375aa79b31fba2fee94be2d8120bcf6e0" Dec 03 06:57:14 crc kubenswrapper[4831]: I1203 06:57:14.070302 4831 scope.go:117] "RemoveContainer" containerID="309cf75d8cbb562efc0116b04c83dd3b65a4b14e1de35f409f188bc0a497e2b6" Dec 03 06:57:14 crc kubenswrapper[4831]: I1203 06:57:14.087893 4831 scope.go:117] "RemoveContainer" containerID="6d6b1c280fd68fd1f6d2b15694ba84f3b57d21b989d2fef9e7c2285d54cbffc8" Dec 03 06:57:14 crc kubenswrapper[4831]: I1203 06:57:14.111228 4831 scope.go:117] "RemoveContainer" containerID="48e689b6dcef20403bef843759f14be2c5ce12c72fc1101d7f6683cde1f5871c" Dec 03 06:57:25 crc kubenswrapper[4831]: I1203 06:57:25.013142 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:57:25 crc kubenswrapper[4831]: E1203 06:57:25.014197 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:57:36 crc kubenswrapper[4831]: I1203 06:57:36.013101 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:57:36 crc kubenswrapper[4831]: E1203 06:57:36.013849 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:57:49 crc kubenswrapper[4831]: I1203 06:57:49.012306 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:57:49 crc kubenswrapper[4831]: E1203 06:57:49.012997 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:58:00 crc kubenswrapper[4831]: I1203 06:58:00.012334 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:58:00 crc kubenswrapper[4831]: E1203 06:58:00.012954 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.019185 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:58:14 crc kubenswrapper[4831]: E1203 06:58:14.020530 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.336904 4831 scope.go:117] "RemoveContainer" containerID="d84b948796c9132bd3b586cb87ffe22a8a11b658e05ec8005cfa4a6023a15577" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.412844 4831 scope.go:117] "RemoveContainer" containerID="202e6afb367fa1688e99525965f011edec7e64075502bdc543a3d94545c54fcf" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.478305 4831 scope.go:117] "RemoveContainer" containerID="4eef0053326ed74823bd51d439a4fac226fd4c4da5f8fb1077ece054b1be9bf8" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.529780 4831 scope.go:117] "RemoveContainer" containerID="5ef858aa3c34c29bab2a25c1ab6ce3b92a27fd595642d016617e9623d2c38b49" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.559360 4831 scope.go:117] "RemoveContainer" containerID="4d7afff5bfb7f70c1f6e79575dad07b5923d28bf89c9b89e6bf75d1bd7f50d68" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.593266 4831 scope.go:117] "RemoveContainer" containerID="361dfa918d7a67eda1a1a06e0059d6ef2b6bf45cf14229417deb0087cee8f873" 
Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.638707 4831 scope.go:117] "RemoveContainer" containerID="5173ebd2612f2bca1eb5bfe9a9a4e301ffa1ac330d0fd6c0b492b51d60a348c0" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.659378 4831 scope.go:117] "RemoveContainer" containerID="18b7e657c68eea07658d1883b7cea150a92aa7f604639f69593adc7e316fa57c" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.678705 4831 scope.go:117] "RemoveContainer" containerID="68e8d72f0a3c2b88f2aef5d2282e0149f866aaf159fa298da782fd0be9e123cb" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.700092 4831 scope.go:117] "RemoveContainer" containerID="0c9038e6a18689fceba5e0a50ac6d4a0a041c704037c77f6242ca7ae37b78999" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.730057 4831 scope.go:117] "RemoveContainer" containerID="491f85725d63c87d76c7a6e62d6c33ebed834612994f604e9a07a041f9273116" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.770114 4831 scope.go:117] "RemoveContainer" containerID="8f1ceb54accf6cc95db136b64ce1f4080a7894f0ee7b6160b877d7044f1c4354" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.802507 4831 scope.go:117] "RemoveContainer" containerID="bf3305a025f14b656d2a04fe1dcbbd105079d7a12d26aa4a54d3b942e9fa3357" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.838002 4831 scope.go:117] "RemoveContainer" containerID="cadf275943a0d51df428d9e8f0db5f89e2c97e8e7f7a9f3ee7ee2e2d69b75447" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.865354 4831 scope.go:117] "RemoveContainer" containerID="bfdaec1e442053ec75746ee712e4fee25a008c9a2e11ba56c206f7698abbcfcd" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.883933 4831 scope.go:117] "RemoveContainer" containerID="4dfb2678a75458c31ce4e501a156049718eef44858bc8bb12bca6a8c8f4adbfa" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.906613 4831 scope.go:117] "RemoveContainer" containerID="9c37d8393b980bdf5ae1d421b8b8297aec83f883d91765e900851761b2758842" Dec 03 06:58:14 crc 
kubenswrapper[4831]: I1203 06:58:14.928643 4831 scope.go:117] "RemoveContainer" containerID="0cd0a2d44929d2103ada46d13db9172b16fd96eb4d3d73a7610523b81b7a7a5f" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.952948 4831 scope.go:117] "RemoveContainer" containerID="7ef08bde82922dfbabfe3171b51605013b1ffda34f112a536b00995fa61b5fd4" Dec 03 06:58:14 crc kubenswrapper[4831]: I1203 06:58:14.978597 4831 scope.go:117] "RemoveContainer" containerID="c6d0fe5894f63e35b1a7232d25b36b47f61870d8b5406acfc75dfc2ca5424ee7" Dec 03 06:58:15 crc kubenswrapper[4831]: I1203 06:58:15.003729 4831 scope.go:117] "RemoveContainer" containerID="70797ca9857cf493d328084e6e35279fdc2c8eaa7d4a14ba2b9c4f7d662a349c" Dec 03 06:58:26 crc kubenswrapper[4831]: I1203 06:58:26.013449 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:58:26 crc kubenswrapper[4831]: E1203 06:58:26.014507 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:58:38 crc kubenswrapper[4831]: I1203 06:58:38.017277 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:58:38 crc kubenswrapper[4831]: E1203 06:58:38.019124 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:58:53 crc kubenswrapper[4831]: I1203 06:58:53.025437 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:58:53 crc kubenswrapper[4831]: E1203 06:58:53.026368 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:59:08 crc kubenswrapper[4831]: I1203 06:59:08.012648 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:59:08 crc kubenswrapper[4831]: E1203 06:59:08.013348 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.381120 4831 scope.go:117] "RemoveContainer" containerID="54f683321e8d2d8116c08f5d3cfe3537791340a4d01427fbdbb99a494adf2145" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.422872 4831 scope.go:117] "RemoveContainer" containerID="75b047084dc8b16ee4f7f7d719653de8add48f84de233fd696b42a672c3d2c8e" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.470742 4831 scope.go:117] "RemoveContainer" containerID="67de1998f02ce4f499b3387ae3a93d5ab8ad64720e8c404dce1cf6612058988f" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.496139 4831 
scope.go:117] "RemoveContainer" containerID="cb55894e2c9cf5e0ad069474f6818fd633cf016e9cf5e6f36f1acf0b93692be8" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.527158 4831 scope.go:117] "RemoveContainer" containerID="134ca55163ecb5e5269d7ec7014e4b029385ad75e92d2133a4bef2bd84b55d9f" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.555871 4831 scope.go:117] "RemoveContainer" containerID="189867c0bb890f5deb91e700fea6eb59d7952dd3df91cce7a5a46136b51ef4e8" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.578621 4831 scope.go:117] "RemoveContainer" containerID="74e0644a9792269f43c86550ff06e6fa5782d3aaca7a69696c14e40e26a5beec" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.609697 4831 scope.go:117] "RemoveContainer" containerID="559f2b757d8e6a0c71983c6a77dee78c697efd373adaa39024dff05879505861" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.638784 4831 scope.go:117] "RemoveContainer" containerID="ab65d6c3a454bed3af0834fe18511469a011150d61395258c6f5699f0ddd1b99" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.663129 4831 scope.go:117] "RemoveContainer" containerID="37442f25c3406732dc8f977e015ae0191d021a30df623fa7e2cf9a8112d10750" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.683110 4831 scope.go:117] "RemoveContainer" containerID="291b131153029f387717c7da3fd356090449748a8a6f50e5e4ea5f9663532572" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.698269 4831 scope.go:117] "RemoveContainer" containerID="a6fcbb703b3c57a0767a6e84e0a057dc745d0f18dbb25eedf9c9e63269b59524" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.719752 4831 scope.go:117] "RemoveContainer" containerID="f43ab3aa4c0eaaff226644bb2adc947099b427f65d25d4d5d58debc73fcb00c2" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.740177 4831 scope.go:117] "RemoveContainer" containerID="81b5150fbd0b1172d6d28f39ba926d2a465c6b9bc8c2478685908f4bccff980a" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.759589 4831 scope.go:117] 
"RemoveContainer" containerID="fde2e50f4d7cfe1b96e8b3f523c0e29c649af725d445a3690039bc74b475203c" Dec 03 06:59:15 crc kubenswrapper[4831]: I1203 06:59:15.782971 4831 scope.go:117] "RemoveContainer" containerID="cc59704ac175aa51eef90665f0c142d600f23e32957bdb845b6911299943cd82" Dec 03 06:59:21 crc kubenswrapper[4831]: I1203 06:59:21.012541 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:59:21 crc kubenswrapper[4831]: E1203 06:59:21.013392 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:59:36 crc kubenswrapper[4831]: I1203 06:59:36.014233 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:59:36 crc kubenswrapper[4831]: E1203 06:59:36.015237 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 06:59:48 crc kubenswrapper[4831]: I1203 06:59:48.014037 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 06:59:48 crc kubenswrapper[4831]: E1203 06:59:48.015380 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.172107 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh"] Dec 03 07:00:00 crc kubenswrapper[4831]: E1203 07:00:00.173425 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerName="extract-content" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.173452 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerName="extract-content" Dec 03 07:00:00 crc kubenswrapper[4831]: E1203 07:00:00.173470 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerName="extract-utilities" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.173483 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerName="extract-utilities" Dec 03 07:00:00 crc kubenswrapper[4831]: E1203 07:00:00.173533 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerName="registry-server" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.173545 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerName="registry-server" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.173841 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8bdcae1-5a8e-4743-bd4e-d9865ac5aa5c" containerName="registry-server" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.174912 4831 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.177375 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.178407 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.184132 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh"] Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.330199 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkz5s\" (UniqueName: \"kubernetes.io/projected/ba20a9e0-a368-4398-af6d-2b8ce42de91a-kube-api-access-kkz5s\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.330251 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba20a9e0-a368-4398-af6d-2b8ce42de91a-config-volume\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.330275 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba20a9e0-a368-4398-af6d-2b8ce42de91a-secret-volume\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.431843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkz5s\" (UniqueName: \"kubernetes.io/projected/ba20a9e0-a368-4398-af6d-2b8ce42de91a-kube-api-access-kkz5s\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.432482 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba20a9e0-a368-4398-af6d-2b8ce42de91a-config-volume\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.432707 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba20a9e0-a368-4398-af6d-2b8ce42de91a-secret-volume\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.435633 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba20a9e0-a368-4398-af6d-2b8ce42de91a-config-volume\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.442727 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/ba20a9e0-a368-4398-af6d-2b8ce42de91a-secret-volume\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.457275 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkz5s\" (UniqueName: \"kubernetes.io/projected/ba20a9e0-a368-4398-af6d-2b8ce42de91a-kube-api-access-kkz5s\") pod \"collect-profiles-29412420-x54xh\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:00 crc kubenswrapper[4831]: I1203 07:00:00.492472 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:01 crc kubenswrapper[4831]: I1203 07:00:01.009697 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh"] Dec 03 07:00:01 crc kubenswrapper[4831]: I1203 07:00:01.897740 4831 generic.go:334] "Generic (PLEG): container finished" podID="ba20a9e0-a368-4398-af6d-2b8ce42de91a" containerID="8dff0c63ec02427be9dc3bb1766830881fa00b3e23393bff150f01bc6ef7f266" exitCode=0 Dec 03 07:00:01 crc kubenswrapper[4831]: I1203 07:00:01.897881 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" event={"ID":"ba20a9e0-a368-4398-af6d-2b8ce42de91a","Type":"ContainerDied","Data":"8dff0c63ec02427be9dc3bb1766830881fa00b3e23393bff150f01bc6ef7f266"} Dec 03 07:00:01 crc kubenswrapper[4831]: I1203 07:00:01.897975 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" 
event={"ID":"ba20a9e0-a368-4398-af6d-2b8ce42de91a","Type":"ContainerStarted","Data":"ac9869472fb95a996f1f835a01c7bee012186d8466159d29d9cf58d70a3d0895"} Dec 03 07:00:02 crc kubenswrapper[4831]: I1203 07:00:02.013361 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:00:02 crc kubenswrapper[4831]: E1203 07:00:02.013752 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.275570 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.473300 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kkz5s\" (UniqueName: \"kubernetes.io/projected/ba20a9e0-a368-4398-af6d-2b8ce42de91a-kube-api-access-kkz5s\") pod \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.473405 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba20a9e0-a368-4398-af6d-2b8ce42de91a-secret-volume\") pod \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.473518 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/ba20a9e0-a368-4398-af6d-2b8ce42de91a-config-volume\") pod \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\" (UID: \"ba20a9e0-a368-4398-af6d-2b8ce42de91a\") " Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.474157 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba20a9e0-a368-4398-af6d-2b8ce42de91a-config-volume" (OuterVolumeSpecName: "config-volume") pod "ba20a9e0-a368-4398-af6d-2b8ce42de91a" (UID: "ba20a9e0-a368-4398-af6d-2b8ce42de91a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.479138 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba20a9e0-a368-4398-af6d-2b8ce42de91a-kube-api-access-kkz5s" (OuterVolumeSpecName: "kube-api-access-kkz5s") pod "ba20a9e0-a368-4398-af6d-2b8ce42de91a" (UID: "ba20a9e0-a368-4398-af6d-2b8ce42de91a"). InnerVolumeSpecName "kube-api-access-kkz5s". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.479173 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba20a9e0-a368-4398-af6d-2b8ce42de91a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ba20a9e0-a368-4398-af6d-2b8ce42de91a" (UID: "ba20a9e0-a368-4398-af6d-2b8ce42de91a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.574974 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ba20a9e0-a368-4398-af6d-2b8ce42de91a-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.575065 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba20a9e0-a368-4398-af6d-2b8ce42de91a-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.575095 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kkz5s\" (UniqueName: \"kubernetes.io/projected/ba20a9e0-a368-4398-af6d-2b8ce42de91a-kube-api-access-kkz5s\") on node \"crc\" DevicePath \"\"" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.921374 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" event={"ID":"ba20a9e0-a368-4398-af6d-2b8ce42de91a","Type":"ContainerDied","Data":"ac9869472fb95a996f1f835a01c7bee012186d8466159d29d9cf58d70a3d0895"} Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.921445 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac9869472fb95a996f1f835a01c7bee012186d8466159d29d9cf58d70a3d0895" Dec 03 07:00:03 crc kubenswrapper[4831]: I1203 07:00:03.921468 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh" Dec 03 07:00:14 crc kubenswrapper[4831]: I1203 07:00:14.012933 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:00:14 crc kubenswrapper[4831]: E1203 07:00:14.014107 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:00:15 crc kubenswrapper[4831]: I1203 07:00:15.982863 4831 scope.go:117] "RemoveContainer" containerID="06366c6c3c4c924b06643fdb1184e05bdab909f3fb392fa7334a1d7661ab4b77" Dec 03 07:00:16 crc kubenswrapper[4831]: I1203 07:00:16.029799 4831 scope.go:117] "RemoveContainer" containerID="cead6ad1cbb0e7e540d790058263a8a503bc9aedb3e982519dedca76eca29655" Dec 03 07:00:16 crc kubenswrapper[4831]: I1203 07:00:16.070381 4831 scope.go:117] "RemoveContainer" containerID="658edd03dc1f0ca24b716ddd82b056b7f1cebc66352ec23318becee9a216d637" Dec 03 07:00:27 crc kubenswrapper[4831]: I1203 07:00:27.013185 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:00:27 crc kubenswrapper[4831]: E1203 07:00:27.013977 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:00:40 
crc kubenswrapper[4831]: I1203 07:00:40.014618 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:00:40 crc kubenswrapper[4831]: E1203 07:00:40.015512 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:00:54 crc kubenswrapper[4831]: I1203 07:00:54.012440 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:00:54 crc kubenswrapper[4831]: E1203 07:00:54.013190 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:01:08 crc kubenswrapper[4831]: I1203 07:01:08.013504 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:01:08 crc kubenswrapper[4831]: E1203 07:01:08.014792 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" 
Dec 03 07:01:16 crc kubenswrapper[4831]: I1203 07:01:16.169074 4831 scope.go:117] "RemoveContainer" containerID="0683fcacd44a2588366c5fcd6cc3611fb9f316d87e1f2dde6d14ccb17889efd5" Dec 03 07:01:16 crc kubenswrapper[4831]: I1203 07:01:16.202962 4831 scope.go:117] "RemoveContainer" containerID="7d0ad17b9c4b331c9b8e4e43f1949619c278138c064021159c668b145db79938" Dec 03 07:01:16 crc kubenswrapper[4831]: I1203 07:01:16.228221 4831 scope.go:117] "RemoveContainer" containerID="d9f25cfe98a3d7189a069459787b3914756469c02ce6e44bffed77d599dd4887" Dec 03 07:01:16 crc kubenswrapper[4831]: I1203 07:01:16.256782 4831 scope.go:117] "RemoveContainer" containerID="f6d28e8b3135eebdfd374f9c303c4d8edb75a0da36bd550a21205abedf1942c2" Dec 03 07:01:16 crc kubenswrapper[4831]: I1203 07:01:16.293005 4831 scope.go:117] "RemoveContainer" containerID="a5a52868ab8bcbffa60ec710ffeaf3085cae1de015dd39d00cd7003328b22a28" Dec 03 07:01:16 crc kubenswrapper[4831]: I1203 07:01:16.325639 4831 scope.go:117] "RemoveContainer" containerID="494bdbb04a97a9462642db6ed2eac92b40a565c9874d9c4019b5695ebc6ac105" Dec 03 07:01:16 crc kubenswrapper[4831]: I1203 07:01:16.349982 4831 scope.go:117] "RemoveContainer" containerID="90079cc1fa171a7d099cbc2a89a597829b7d80cd9239ad5c381d51f605510206" Dec 03 07:01:20 crc kubenswrapper[4831]: I1203 07:01:20.012687 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:01:20 crc kubenswrapper[4831]: E1203 07:01:20.013230 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:01:33 crc kubenswrapper[4831]: I1203 07:01:33.023355 4831 
scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:01:33 crc kubenswrapper[4831]: E1203 07:01:33.024234 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:01:48 crc kubenswrapper[4831]: I1203 07:01:48.012428 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:01:48 crc kubenswrapper[4831]: E1203 07:01:48.013114 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:02:03 crc kubenswrapper[4831]: I1203 07:02:03.021865 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:02:04 crc kubenswrapper[4831]: I1203 07:02:04.243687 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"d44b88821d6284bb85c274d1d4055519af09d0da6d45790a48b6162617c80d83"} Dec 03 07:02:16 crc kubenswrapper[4831]: I1203 07:02:16.444439 4831 scope.go:117] "RemoveContainer" containerID="557c4ed3ac9d594aa36c1567b066b8623beb6bdaab234448402e61224dd28e60" Dec 03 07:02:16 crc 
kubenswrapper[4831]: I1203 07:02:16.473178 4831 scope.go:117] "RemoveContainer" containerID="a69329035f458ff6fcb46d258f921e780538a61f81fd40d9c93eb03532d06f8d" Dec 03 07:02:16 crc kubenswrapper[4831]: I1203 07:02:16.501438 4831 scope.go:117] "RemoveContainer" containerID="15b7bbc85d81bb5e07862ca4b4de5ff3923ce96b0caa2a68467827bd0e8e0b7f" Dec 03 07:02:16 crc kubenswrapper[4831]: I1203 07:02:16.523825 4831 scope.go:117] "RemoveContainer" containerID="1ce4d862c594baa625d818638ef9af5cb1a640bf2485dc86c14507efd56534c6" Dec 03 07:02:16 crc kubenswrapper[4831]: I1203 07:02:16.545661 4831 scope.go:117] "RemoveContainer" containerID="9ec26ef13af70543f75a8bc515a87380e1e69e18ff78e24db623bf3f171971b6" Dec 03 07:02:16 crc kubenswrapper[4831]: I1203 07:02:16.567930 4831 scope.go:117] "RemoveContainer" containerID="ac040d35ff00ea3fd5ce4c567c429329ee4b16038dc1dca2dd296dc049531e88" Dec 03 07:03:16 crc kubenswrapper[4831]: I1203 07:03:16.715953 4831 scope.go:117] "RemoveContainer" containerID="b5e63408e660556d219617db98cb317869a446e6d653e07924ead34cbafb1f0b" Dec 03 07:03:16 crc kubenswrapper[4831]: I1203 07:03:16.754681 4831 scope.go:117] "RemoveContainer" containerID="161669581ef6046dceffe6fc36a60e7c20d8fc94079eb3e47801a9fd925abbe7" Dec 03 07:03:16 crc kubenswrapper[4831]: I1203 07:03:16.789065 4831 scope.go:117] "RemoveContainer" containerID="e14322d4e3b87e93e081e052eaf5a30d671af99571fe49ebd887823079b170b0" Dec 03 07:04:27 crc kubenswrapper[4831]: I1203 07:04:27.596982 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:04:27 crc kubenswrapper[4831]: I1203 07:04:27.598039 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.803482 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lbvps"] Dec 03 07:04:41 crc kubenswrapper[4831]: E1203 07:04:41.804248 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba20a9e0-a368-4398-af6d-2b8ce42de91a" containerName="collect-profiles" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.804261 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba20a9e0-a368-4398-af6d-2b8ce42de91a" containerName="collect-profiles" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.804482 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba20a9e0-a368-4398-af6d-2b8ce42de91a" containerName="collect-profiles" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.805500 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.822858 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbvps"] Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.892554 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-utilities\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.892664 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-catalog-content\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.892695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lh4s8\" (UniqueName: \"kubernetes.io/projected/e57585be-a6ff-4f4a-a978-3cc79b794614-kube-api-access-lh4s8\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.994100 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-utilities\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.994202 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-catalog-content\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.994232 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lh4s8\" (UniqueName: \"kubernetes.io/projected/e57585be-a6ff-4f4a-a978-3cc79b794614-kube-api-access-lh4s8\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.994779 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-utilities\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:41 crc kubenswrapper[4831]: I1203 07:04:41.994839 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-catalog-content\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:42 crc kubenswrapper[4831]: I1203 07:04:42.026158 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lh4s8\" (UniqueName: \"kubernetes.io/projected/e57585be-a6ff-4f4a-a978-3cc79b794614-kube-api-access-lh4s8\") pod \"redhat-marketplace-lbvps\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:42 crc kubenswrapper[4831]: I1203 07:04:42.129929 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:42 crc kubenswrapper[4831]: I1203 07:04:42.635506 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbvps"] Dec 03 07:04:42 crc kubenswrapper[4831]: I1203 07:04:42.656578 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbvps" event={"ID":"e57585be-a6ff-4f4a-a978-3cc79b794614","Type":"ContainerStarted","Data":"e858b278f05f4f681a9f5ee8d071c38245d3a918965ba9f9d807ffa518c8fe3a"} Dec 03 07:04:43 crc kubenswrapper[4831]: I1203 07:04:43.665962 4831 generic.go:334] "Generic (PLEG): container finished" podID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerID="667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60" exitCode=0 Dec 03 07:04:43 crc kubenswrapper[4831]: I1203 07:04:43.666014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbvps" event={"ID":"e57585be-a6ff-4f4a-a978-3cc79b794614","Type":"ContainerDied","Data":"667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60"} Dec 03 07:04:43 crc kubenswrapper[4831]: I1203 07:04:43.668983 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:04:44 crc kubenswrapper[4831]: I1203 07:04:44.676916 4831 generic.go:334] "Generic (PLEG): container finished" podID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerID="6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b" exitCode=0 Dec 03 07:04:44 crc kubenswrapper[4831]: I1203 07:04:44.676994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbvps" event={"ID":"e57585be-a6ff-4f4a-a978-3cc79b794614","Type":"ContainerDied","Data":"6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b"} Dec 03 07:04:45 crc kubenswrapper[4831]: I1203 07:04:45.688711 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-marketplace-lbvps" event={"ID":"e57585be-a6ff-4f4a-a978-3cc79b794614","Type":"ContainerStarted","Data":"ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01"} Dec 03 07:04:45 crc kubenswrapper[4831]: I1203 07:04:45.705595 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lbvps" podStartSLOduration=3.038532935 podStartE2EDuration="4.705561319s" podCreationTimestamp="2025-12-03 07:04:41 +0000 UTC" firstStartedPulling="2025-12-03 07:04:43.668691016 +0000 UTC m=+2021.012274534" lastFinishedPulling="2025-12-03 07:04:45.3357194 +0000 UTC m=+2022.679302918" observedRunningTime="2025-12-03 07:04:45.704575808 +0000 UTC m=+2023.048159336" watchObservedRunningTime="2025-12-03 07:04:45.705561319 +0000 UTC m=+2023.049144877" Dec 03 07:04:52 crc kubenswrapper[4831]: I1203 07:04:52.130962 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:52 crc kubenswrapper[4831]: I1203 07:04:52.131604 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:52 crc kubenswrapper[4831]: I1203 07:04:52.196750 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:52 crc kubenswrapper[4831]: I1203 07:04:52.823256 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:52 crc kubenswrapper[4831]: I1203 07:04:52.893007 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbvps"] Dec 03 07:04:54 crc kubenswrapper[4831]: I1203 07:04:54.775788 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lbvps" 
podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerName="registry-server" containerID="cri-o://ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01" gracePeriod=2 Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.706233 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.748417 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-catalog-content\") pod \"e57585be-a6ff-4f4a-a978-3cc79b794614\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.748499 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-utilities\") pod \"e57585be-a6ff-4f4a-a978-3cc79b794614\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.748605 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lh4s8\" (UniqueName: \"kubernetes.io/projected/e57585be-a6ff-4f4a-a978-3cc79b794614-kube-api-access-lh4s8\") pod \"e57585be-a6ff-4f4a-a978-3cc79b794614\" (UID: \"e57585be-a6ff-4f4a-a978-3cc79b794614\") " Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.749257 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-utilities" (OuterVolumeSpecName: "utilities") pod "e57585be-a6ff-4f4a-a978-3cc79b794614" (UID: "e57585be-a6ff-4f4a-a978-3cc79b794614"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.757104 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e57585be-a6ff-4f4a-a978-3cc79b794614-kube-api-access-lh4s8" (OuterVolumeSpecName: "kube-api-access-lh4s8") pod "e57585be-a6ff-4f4a-a978-3cc79b794614" (UID: "e57585be-a6ff-4f4a-a978-3cc79b794614"). InnerVolumeSpecName "kube-api-access-lh4s8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.766780 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e57585be-a6ff-4f4a-a978-3cc79b794614" (UID: "e57585be-a6ff-4f4a-a978-3cc79b794614"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.786895 4831 generic.go:334] "Generic (PLEG): container finished" podID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerID="ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01" exitCode=0 Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.786947 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbvps" event={"ID":"e57585be-a6ff-4f4a-a978-3cc79b794614","Type":"ContainerDied","Data":"ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01"} Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.786943 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lbvps" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.787029 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lbvps" event={"ID":"e57585be-a6ff-4f4a-a978-3cc79b794614","Type":"ContainerDied","Data":"e858b278f05f4f681a9f5ee8d071c38245d3a918965ba9f9d807ffa518c8fe3a"} Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.787061 4831 scope.go:117] "RemoveContainer" containerID="ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.811658 4831 scope.go:117] "RemoveContainer" containerID="6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.841037 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbvps"] Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.847178 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lbvps"] Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.849239 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.849271 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e57585be-a6ff-4f4a-a978-3cc79b794614-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.849284 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lh4s8\" (UniqueName: \"kubernetes.io/projected/e57585be-a6ff-4f4a-a978-3cc79b794614-kube-api-access-lh4s8\") on node \"crc\" DevicePath \"\"" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.849501 4831 scope.go:117] 
"RemoveContainer" containerID="667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.868484 4831 scope.go:117] "RemoveContainer" containerID="ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01" Dec 03 07:04:55 crc kubenswrapper[4831]: E1203 07:04:55.868917 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01\": container with ID starting with ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01 not found: ID does not exist" containerID="ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.868950 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01"} err="failed to get container status \"ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01\": rpc error: code = NotFound desc = could not find container \"ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01\": container with ID starting with ed668c63a8c5aa427b99a2685b2089e7e126a3818f3e479dfb19404a3b092c01 not found: ID does not exist" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.868974 4831 scope.go:117] "RemoveContainer" containerID="6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b" Dec 03 07:04:55 crc kubenswrapper[4831]: E1203 07:04:55.869280 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b\": container with ID starting with 6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b not found: ID does not exist" containerID="6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b" Dec 03 07:04:55 crc 
kubenswrapper[4831]: I1203 07:04:55.869309 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b"} err="failed to get container status \"6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b\": rpc error: code = NotFound desc = could not find container \"6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b\": container with ID starting with 6be022e65aa7debee21337c3a3e126af6c4b97fa84a59408346b951c71b6f80b not found: ID does not exist" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.869339 4831 scope.go:117] "RemoveContainer" containerID="667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60" Dec 03 07:04:55 crc kubenswrapper[4831]: E1203 07:04:55.869630 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60\": container with ID starting with 667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60 not found: ID does not exist" containerID="667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60" Dec 03 07:04:55 crc kubenswrapper[4831]: I1203 07:04:55.869658 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60"} err="failed to get container status \"667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60\": rpc error: code = NotFound desc = could not find container \"667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60\": container with ID starting with 667f1e380111630e1aeccfbaa29b70623b93cbd18bac0a901ecb32d6f95b3c60 not found: ID does not exist" Dec 03 07:04:57 crc kubenswrapper[4831]: I1203 07:04:57.022253 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" 
path="/var/lib/kubelet/pods/e57585be-a6ff-4f4a-a978-3cc79b794614/volumes" Dec 03 07:04:57 crc kubenswrapper[4831]: I1203 07:04:57.596952 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:04:57 crc kubenswrapper[4831]: I1203 07:04:57.597042 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:05:27 crc kubenswrapper[4831]: I1203 07:05:27.596918 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:05:27 crc kubenswrapper[4831]: I1203 07:05:27.597599 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:05:27 crc kubenswrapper[4831]: I1203 07:05:27.597755 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 07:05:27 crc kubenswrapper[4831]: I1203 07:05:27.598768 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"d44b88821d6284bb85c274d1d4055519af09d0da6d45790a48b6162617c80d83"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:05:27 crc kubenswrapper[4831]: I1203 07:05:27.598876 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://d44b88821d6284bb85c274d1d4055519af09d0da6d45790a48b6162617c80d83" gracePeriod=600 Dec 03 07:05:28 crc kubenswrapper[4831]: I1203 07:05:28.089340 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="d44b88821d6284bb85c274d1d4055519af09d0da6d45790a48b6162617c80d83" exitCode=0 Dec 03 07:05:28 crc kubenswrapper[4831]: I1203 07:05:28.089378 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"d44b88821d6284bb85c274d1d4055519af09d0da6d45790a48b6162617c80d83"} Dec 03 07:05:28 crc kubenswrapper[4831]: I1203 07:05:28.089734 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7"} Dec 03 07:05:28 crc kubenswrapper[4831]: I1203 07:05:28.089768 4831 scope.go:117] "RemoveContainer" containerID="3938234e086b7963705c0dfcbf75fab6acb8a9996de6f91d228814b38359d12d" Dec 03 07:07:27 crc kubenswrapper[4831]: I1203 07:07:27.597042 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe 
status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:07:27 crc kubenswrapper[4831]: I1203 07:07:27.598503 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.033243 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qpvlk"] Dec 03 07:07:44 crc kubenswrapper[4831]: E1203 07:07:44.036213 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerName="registry-server" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.036252 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerName="registry-server" Dec 03 07:07:44 crc kubenswrapper[4831]: E1203 07:07:44.036305 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerName="extract-utilities" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.036356 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerName="extract-utilities" Dec 03 07:07:44 crc kubenswrapper[4831]: E1203 07:07:44.036387 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerName="extract-content" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.036400 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerName="extract-content" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.036760 4831 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="e57585be-a6ff-4f4a-a978-3cc79b794614" containerName="registry-server" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.038958 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.052946 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpvlk"] Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.143571 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg56d\" (UniqueName: \"kubernetes.io/projected/48f08298-7a03-4a22-b570-1f498f6cb980-kube-api-access-gg56d\") pod \"certified-operators-qpvlk\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.143654 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-utilities\") pod \"certified-operators-qpvlk\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.143717 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-catalog-content\") pod \"certified-operators-qpvlk\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.245909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-utilities\") pod \"certified-operators-qpvlk\" (UID: 
\"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.246006 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-catalog-content\") pod \"certified-operators-qpvlk\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.246094 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gg56d\" (UniqueName: \"kubernetes.io/projected/48f08298-7a03-4a22-b570-1f498f6cb980-kube-api-access-gg56d\") pod \"certified-operators-qpvlk\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.246822 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-utilities\") pod \"certified-operators-qpvlk\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.247050 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-catalog-content\") pod \"certified-operators-qpvlk\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.268489 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg56d\" (UniqueName: \"kubernetes.io/projected/48f08298-7a03-4a22-b570-1f498f6cb980-kube-api-access-gg56d\") pod \"certified-operators-qpvlk\" (UID: 
\"48f08298-7a03-4a22-b570-1f498f6cb980\") " pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.376873 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.427810 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tblqc"] Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.429262 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.445084 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tblqc"] Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.554272 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-catalog-content\") pod \"community-operators-tblqc\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.554394 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-utilities\") pod \"community-operators-tblqc\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.554464 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wwnp\" (UniqueName: \"kubernetes.io/projected/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-kube-api-access-2wwnp\") pod \"community-operators-tblqc\" (UID: 
\"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.655832 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wwnp\" (UniqueName: \"kubernetes.io/projected/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-kube-api-access-2wwnp\") pod \"community-operators-tblqc\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.655880 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-catalog-content\") pod \"community-operators-tblqc\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.655947 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-utilities\") pod \"community-operators-tblqc\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.656501 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-utilities\") pod \"community-operators-tblqc\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.657031 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-catalog-content\") pod \"community-operators-tblqc\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") 
" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.678808 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wwnp\" (UniqueName: \"kubernetes.io/projected/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-kube-api-access-2wwnp\") pod \"community-operators-tblqc\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.779258 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:44 crc kubenswrapper[4831]: I1203 07:07:44.896670 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qpvlk"] Dec 03 07:07:44 crc kubenswrapper[4831]: W1203 07:07:44.905516 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48f08298_7a03_4a22_b570_1f498f6cb980.slice/crio-e11178fb401cb86a2aeeceeafe6e2e842724b39e5db30c050d11954389068ee2 WatchSource:0}: Error finding container e11178fb401cb86a2aeeceeafe6e2e842724b39e5db30c050d11954389068ee2: Status 404 returned error can't find the container with id e11178fb401cb86a2aeeceeafe6e2e842724b39e5db30c050d11954389068ee2 Dec 03 07:07:45 crc kubenswrapper[4831]: I1203 07:07:45.071094 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tblqc"] Dec 03 07:07:45 crc kubenswrapper[4831]: I1203 07:07:45.463170 4831 generic.go:334] "Generic (PLEG): container finished" podID="48f08298-7a03-4a22-b570-1f498f6cb980" containerID="df7f79bf7c7b9891affdb059297ebf134acfec25a88513cc8a8ef6888fa71fb3" exitCode=0 Dec 03 07:07:45 crc kubenswrapper[4831]: I1203 07:07:45.463271 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpvlk" 
event={"ID":"48f08298-7a03-4a22-b570-1f498f6cb980","Type":"ContainerDied","Data":"df7f79bf7c7b9891affdb059297ebf134acfec25a88513cc8a8ef6888fa71fb3"} Dec 03 07:07:45 crc kubenswrapper[4831]: I1203 07:07:45.463302 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpvlk" event={"ID":"48f08298-7a03-4a22-b570-1f498f6cb980","Type":"ContainerStarted","Data":"e11178fb401cb86a2aeeceeafe6e2e842724b39e5db30c050d11954389068ee2"} Dec 03 07:07:45 crc kubenswrapper[4831]: I1203 07:07:45.466349 4831 generic.go:334] "Generic (PLEG): container finished" podID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerID="b012826cc876b9b2650fafd2436ffece0a19f1f69bcd3e07b9fc0cbb73bf889f" exitCode=0 Dec 03 07:07:45 crc kubenswrapper[4831]: I1203 07:07:45.466442 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tblqc" event={"ID":"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e","Type":"ContainerDied","Data":"b012826cc876b9b2650fafd2436ffece0a19f1f69bcd3e07b9fc0cbb73bf889f"} Dec 03 07:07:45 crc kubenswrapper[4831]: I1203 07:07:45.466516 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tblqc" event={"ID":"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e","Type":"ContainerStarted","Data":"3363c4e53fb78adac3c42c7ec94d68ccdc0aa9e84f3cd31d804d998f35b9314e"} Dec 03 07:07:46 crc kubenswrapper[4831]: I1203 07:07:46.475416 4831 generic.go:334] "Generic (PLEG): container finished" podID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerID="ae5c457a0a80100b354f942a8f24a5decf689f2bd11ac029077de44ac806671a" exitCode=0 Dec 03 07:07:46 crc kubenswrapper[4831]: I1203 07:07:46.475632 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tblqc" event={"ID":"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e","Type":"ContainerDied","Data":"ae5c457a0a80100b354f942a8f24a5decf689f2bd11ac029077de44ac806671a"} Dec 03 07:07:46 crc kubenswrapper[4831]: I1203 
07:07:46.479299 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpvlk" event={"ID":"48f08298-7a03-4a22-b570-1f498f6cb980","Type":"ContainerStarted","Data":"5517c25e44b686874c6f28195d37b5e7faf9cf469711f9ad3e1536caad0c8cff"} Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.419666 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-49sh2"] Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.422945 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.439350 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49sh2"] Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.488926 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tblqc" event={"ID":"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e","Type":"ContainerStarted","Data":"790b6f2d107fba22e5577675aaac214a0dc0a175a48e93db530880889d53be0f"} Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.490976 4831 generic.go:334] "Generic (PLEG): container finished" podID="48f08298-7a03-4a22-b570-1f498f6cb980" containerID="5517c25e44b686874c6f28195d37b5e7faf9cf469711f9ad3e1536caad0c8cff" exitCode=0 Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.491013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpvlk" event={"ID":"48f08298-7a03-4a22-b570-1f498f6cb980","Type":"ContainerDied","Data":"5517c25e44b686874c6f28195d37b5e7faf9cf469711f9ad3e1536caad0c8cff"} Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.499486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-utilities\") pod 
\"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.499598 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-catalog-content\") pod \"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.499694 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djjr6\" (UniqueName: \"kubernetes.io/projected/08595bb8-301f-446f-ac7a-be72b8f154e0-kube-api-access-djjr6\") pod \"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.509059 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tblqc" podStartSLOduration=2.139145079 podStartE2EDuration="3.509042496s" podCreationTimestamp="2025-12-03 07:07:44 +0000 UTC" firstStartedPulling="2025-12-03 07:07:45.468082275 +0000 UTC m=+2202.811665783" lastFinishedPulling="2025-12-03 07:07:46.837979692 +0000 UTC m=+2204.181563200" observedRunningTime="2025-12-03 07:07:47.508464967 +0000 UTC m=+2204.852048495" watchObservedRunningTime="2025-12-03 07:07:47.509042496 +0000 UTC m=+2204.852626004" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.600972 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-utilities\") pod \"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc 
kubenswrapper[4831]: I1203 07:07:47.601030 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-catalog-content\") pod \"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.601054 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djjr6\" (UniqueName: \"kubernetes.io/projected/08595bb8-301f-446f-ac7a-be72b8f154e0-kube-api-access-djjr6\") pod \"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.601508 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-utilities\") pod \"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.601560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-catalog-content\") pod \"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.640646 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djjr6\" (UniqueName: \"kubernetes.io/projected/08595bb8-301f-446f-ac7a-be72b8f154e0-kube-api-access-djjr6\") pod \"redhat-operators-49sh2\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:47 crc kubenswrapper[4831]: I1203 07:07:47.741501 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:48 crc kubenswrapper[4831]: I1203 07:07:48.189281 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-49sh2"] Dec 03 07:07:48 crc kubenswrapper[4831]: I1203 07:07:48.497695 4831 generic.go:334] "Generic (PLEG): container finished" podID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerID="ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b" exitCode=0 Dec 03 07:07:48 crc kubenswrapper[4831]: I1203 07:07:48.497763 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49sh2" event={"ID":"08595bb8-301f-446f-ac7a-be72b8f154e0","Type":"ContainerDied","Data":"ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b"} Dec 03 07:07:48 crc kubenswrapper[4831]: I1203 07:07:48.497792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49sh2" event={"ID":"08595bb8-301f-446f-ac7a-be72b8f154e0","Type":"ContainerStarted","Data":"484f9d2fe7734123793f7f4fc1a76a0d9c025cc43b8fced6984cac7e796f6ff3"} Dec 03 07:07:48 crc kubenswrapper[4831]: I1203 07:07:48.502954 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpvlk" event={"ID":"48f08298-7a03-4a22-b570-1f498f6cb980","Type":"ContainerStarted","Data":"c77e6afe1b0fe0e94be698d083573db1f7ace7d6fb2656e91c55f519ab4698ae"} Dec 03 07:07:48 crc kubenswrapper[4831]: I1203 07:07:48.547792 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qpvlk" podStartSLOduration=2.185098075 podStartE2EDuration="4.547776692s" podCreationTimestamp="2025-12-03 07:07:44 +0000 UTC" firstStartedPulling="2025-12-03 07:07:45.465095921 +0000 UTC m=+2202.808679429" lastFinishedPulling="2025-12-03 07:07:47.827774528 +0000 UTC m=+2205.171358046" 
observedRunningTime="2025-12-03 07:07:48.544754917 +0000 UTC m=+2205.888338415" watchObservedRunningTime="2025-12-03 07:07:48.547776692 +0000 UTC m=+2205.891360190" Dec 03 07:07:50 crc kubenswrapper[4831]: I1203 07:07:50.520945 4831 generic.go:334] "Generic (PLEG): container finished" podID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerID="c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43" exitCode=0 Dec 03 07:07:50 crc kubenswrapper[4831]: I1203 07:07:50.521293 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49sh2" event={"ID":"08595bb8-301f-446f-ac7a-be72b8f154e0","Type":"ContainerDied","Data":"c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43"} Dec 03 07:07:51 crc kubenswrapper[4831]: I1203 07:07:51.531747 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49sh2" event={"ID":"08595bb8-301f-446f-ac7a-be72b8f154e0","Type":"ContainerStarted","Data":"1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb"} Dec 03 07:07:51 crc kubenswrapper[4831]: I1203 07:07:51.567764 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-49sh2" podStartSLOduration=2.105127991 podStartE2EDuration="4.567733431s" podCreationTimestamp="2025-12-03 07:07:47 +0000 UTC" firstStartedPulling="2025-12-03 07:07:48.499267455 +0000 UTC m=+2205.842850963" lastFinishedPulling="2025-12-03 07:07:50.961872865 +0000 UTC m=+2208.305456403" observedRunningTime="2025-12-03 07:07:51.55487278 +0000 UTC m=+2208.898456328" watchObservedRunningTime="2025-12-03 07:07:51.567733431 +0000 UTC m=+2208.911316939" Dec 03 07:07:54 crc kubenswrapper[4831]: I1203 07:07:54.377463 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:54 crc kubenswrapper[4831]: I1203 07:07:54.377885 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:54 crc kubenswrapper[4831]: I1203 07:07:54.454656 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:54 crc kubenswrapper[4831]: I1203 07:07:54.615669 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:54 crc kubenswrapper[4831]: I1203 07:07:54.780113 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:54 crc kubenswrapper[4831]: I1203 07:07:54.780231 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:54 crc kubenswrapper[4831]: I1203 07:07:54.833753 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:55 crc kubenswrapper[4831]: I1203 07:07:55.634329 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:55 crc kubenswrapper[4831]: I1203 07:07:55.808165 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpvlk"] Dec 03 07:07:56 crc kubenswrapper[4831]: I1203 07:07:56.587696 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qpvlk" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" containerName="registry-server" containerID="cri-o://c77e6afe1b0fe0e94be698d083573db1f7ace7d6fb2656e91c55f519ab4698ae" gracePeriod=2 Dec 03 07:07:57 crc kubenswrapper[4831]: I1203 07:07:57.209425 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tblqc"] Dec 03 07:07:57 crc kubenswrapper[4831]: I1203 
07:07:57.597846 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:07:57 crc kubenswrapper[4831]: I1203 07:07:57.598115 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:07:57 crc kubenswrapper[4831]: I1203 07:07:57.603615 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tblqc" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerName="registry-server" containerID="cri-o://790b6f2d107fba22e5577675aaac214a0dc0a175a48e93db530880889d53be0f" gracePeriod=2 Dec 03 07:07:57 crc kubenswrapper[4831]: I1203 07:07:57.742329 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:57 crc kubenswrapper[4831]: I1203 07:07:57.742571 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:57 crc kubenswrapper[4831]: I1203 07:07:57.795795 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:58 crc kubenswrapper[4831]: I1203 07:07:58.614133 4831 generic.go:334] "Generic (PLEG): container finished" podID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerID="790b6f2d107fba22e5577675aaac214a0dc0a175a48e93db530880889d53be0f" exitCode=0 Dec 03 07:07:58 crc kubenswrapper[4831]: I1203 07:07:58.614276 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-tblqc" event={"ID":"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e","Type":"ContainerDied","Data":"790b6f2d107fba22e5577675aaac214a0dc0a175a48e93db530880889d53be0f"} Dec 03 07:07:58 crc kubenswrapper[4831]: I1203 07:07:58.617640 4831 generic.go:334] "Generic (PLEG): container finished" podID="48f08298-7a03-4a22-b570-1f498f6cb980" containerID="c77e6afe1b0fe0e94be698d083573db1f7ace7d6fb2656e91c55f519ab4698ae" exitCode=0 Dec 03 07:07:58 crc kubenswrapper[4831]: I1203 07:07:58.617765 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpvlk" event={"ID":"48f08298-7a03-4a22-b570-1f498f6cb980","Type":"ContainerDied","Data":"c77e6afe1b0fe0e94be698d083573db1f7ace7d6fb2656e91c55f519ab4698ae"} Dec 03 07:07:58 crc kubenswrapper[4831]: I1203 07:07:58.672304 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:07:58 crc kubenswrapper[4831]: I1203 07:07:58.904834 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:58 crc kubenswrapper[4831]: I1203 07:07:58.981700 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.025949 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-utilities\") pod \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.026024 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2wwnp\" (UniqueName: \"kubernetes.io/projected/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-kube-api-access-2wwnp\") pod \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.026111 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-catalog-content\") pod \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\" (UID: \"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e\") " Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.026992 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-utilities" (OuterVolumeSpecName: "utilities") pod "fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" (UID: "fb5a31d8-0fe5-4bba-9b31-85056cb3c36e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.043331 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-kube-api-access-2wwnp" (OuterVolumeSpecName: "kube-api-access-2wwnp") pod "fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" (UID: "fb5a31d8-0fe5-4bba-9b31-85056cb3c36e"). InnerVolumeSpecName "kube-api-access-2wwnp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.073124 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" (UID: "fb5a31d8-0fe5-4bba-9b31-85056cb3c36e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.127304 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-utilities\") pod \"48f08298-7a03-4a22-b570-1f498f6cb980\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.127499 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gg56d\" (UniqueName: \"kubernetes.io/projected/48f08298-7a03-4a22-b570-1f498f6cb980-kube-api-access-gg56d\") pod \"48f08298-7a03-4a22-b570-1f498f6cb980\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.127532 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-catalog-content\") pod \"48f08298-7a03-4a22-b570-1f498f6cb980\" (UID: \"48f08298-7a03-4a22-b570-1f498f6cb980\") " Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.127876 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.127893 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2wwnp\" (UniqueName: 
\"kubernetes.io/projected/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-kube-api-access-2wwnp\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.127906 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.129045 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-utilities" (OuterVolumeSpecName: "utilities") pod "48f08298-7a03-4a22-b570-1f498f6cb980" (UID: "48f08298-7a03-4a22-b570-1f498f6cb980"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.134165 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48f08298-7a03-4a22-b570-1f498f6cb980-kube-api-access-gg56d" (OuterVolumeSpecName: "kube-api-access-gg56d") pod "48f08298-7a03-4a22-b570-1f498f6cb980" (UID: "48f08298-7a03-4a22-b570-1f498f6cb980"). InnerVolumeSpecName "kube-api-access-gg56d". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.181555 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48f08298-7a03-4a22-b570-1f498f6cb980" (UID: "48f08298-7a03-4a22-b570-1f498f6cb980"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.229054 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gg56d\" (UniqueName: \"kubernetes.io/projected/48f08298-7a03-4a22-b570-1f498f6cb980-kube-api-access-gg56d\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.230375 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.230420 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48f08298-7a03-4a22-b570-1f498f6cb980-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.634124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qpvlk" event={"ID":"48f08298-7a03-4a22-b570-1f498f6cb980","Type":"ContainerDied","Data":"e11178fb401cb86a2aeeceeafe6e2e842724b39e5db30c050d11954389068ee2"} Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.635632 4831 scope.go:117] "RemoveContainer" containerID="c77e6afe1b0fe0e94be698d083573db1f7ace7d6fb2656e91c55f519ab4698ae" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.634147 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qpvlk" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.638174 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tblqc" event={"ID":"fb5a31d8-0fe5-4bba-9b31-85056cb3c36e","Type":"ContainerDied","Data":"3363c4e53fb78adac3c42c7ec94d68ccdc0aa9e84f3cd31d804d998f35b9314e"} Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.638216 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tblqc" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.669201 4831 scope.go:117] "RemoveContainer" containerID="5517c25e44b686874c6f28195d37b5e7faf9cf469711f9ad3e1536caad0c8cff" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.706221 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qpvlk"] Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.725172 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qpvlk"] Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.727964 4831 scope.go:117] "RemoveContainer" containerID="df7f79bf7c7b9891affdb059297ebf134acfec25a88513cc8a8ef6888fa71fb3" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.737837 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tblqc"] Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.746037 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tblqc"] Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.756788 4831 scope.go:117] "RemoveContainer" containerID="790b6f2d107fba22e5577675aaac214a0dc0a175a48e93db530880889d53be0f" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.783915 4831 scope.go:117] "RemoveContainer" 
containerID="ae5c457a0a80100b354f942a8f24a5decf689f2bd11ac029077de44ac806671a" Dec 03 07:07:59 crc kubenswrapper[4831]: I1203 07:07:59.807829 4831 scope.go:117] "RemoveContainer" containerID="b012826cc876b9b2650fafd2436ffece0a19f1f69bcd3e07b9fc0cbb73bf889f" Dec 03 07:08:00 crc kubenswrapper[4831]: I1203 07:08:00.613286 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49sh2"] Dec 03 07:08:01 crc kubenswrapper[4831]: I1203 07:08:01.020925 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" path="/var/lib/kubelet/pods/48f08298-7a03-4a22-b570-1f498f6cb980/volumes" Dec 03 07:08:01 crc kubenswrapper[4831]: I1203 07:08:01.022099 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" path="/var/lib/kubelet/pods/fb5a31d8-0fe5-4bba-9b31-85056cb3c36e/volumes" Dec 03 07:08:01 crc kubenswrapper[4831]: I1203 07:08:01.659256 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-49sh2" podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerName="registry-server" containerID="cri-o://1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb" gracePeriod=2 Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.558212 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.670676 4831 generic.go:334] "Generic (PLEG): container finished" podID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerID="1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb" exitCode=0 Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.670732 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49sh2" event={"ID":"08595bb8-301f-446f-ac7a-be72b8f154e0","Type":"ContainerDied","Data":"1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb"} Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.670762 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-49sh2" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.670799 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-49sh2" event={"ID":"08595bb8-301f-446f-ac7a-be72b8f154e0","Type":"ContainerDied","Data":"484f9d2fe7734123793f7f4fc1a76a0d9c025cc43b8fced6984cac7e796f6ff3"} Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.670827 4831 scope.go:117] "RemoveContainer" containerID="1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.693948 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-utilities\") pod \"08595bb8-301f-446f-ac7a-be72b8f154e0\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.694061 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djjr6\" (UniqueName: \"kubernetes.io/projected/08595bb8-301f-446f-ac7a-be72b8f154e0-kube-api-access-djjr6\") pod 
\"08595bb8-301f-446f-ac7a-be72b8f154e0\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.694167 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-catalog-content\") pod \"08595bb8-301f-446f-ac7a-be72b8f154e0\" (UID: \"08595bb8-301f-446f-ac7a-be72b8f154e0\") " Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.695363 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-utilities" (OuterVolumeSpecName: "utilities") pod "08595bb8-301f-446f-ac7a-be72b8f154e0" (UID: "08595bb8-301f-446f-ac7a-be72b8f154e0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.699901 4831 scope.go:117] "RemoveContainer" containerID="c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.700570 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08595bb8-301f-446f-ac7a-be72b8f154e0-kube-api-access-djjr6" (OuterVolumeSpecName: "kube-api-access-djjr6") pod "08595bb8-301f-446f-ac7a-be72b8f154e0" (UID: "08595bb8-301f-446f-ac7a-be72b8f154e0"). InnerVolumeSpecName "kube-api-access-djjr6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.751930 4831 scope.go:117] "RemoveContainer" containerID="ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.789633 4831 scope.go:117] "RemoveContainer" containerID="1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb" Dec 03 07:08:02 crc kubenswrapper[4831]: E1203 07:08:02.790133 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb\": container with ID starting with 1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb not found: ID does not exist" containerID="1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.790197 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb"} err="failed to get container status \"1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb\": rpc error: code = NotFound desc = could not find container \"1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb\": container with ID starting with 1ebaa9fb067514ec9b85871329fd96948283468d903b89f5cb9ecd54da16d1fb not found: ID does not exist" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.790233 4831 scope.go:117] "RemoveContainer" containerID="c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43" Dec 03 07:08:02 crc kubenswrapper[4831]: E1203 07:08:02.790628 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43\": container with ID starting with 
c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43 not found: ID does not exist" containerID="c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.790959 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43"} err="failed to get container status \"c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43\": rpc error: code = NotFound desc = could not find container \"c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43\": container with ID starting with c868471d7a15de93cb94a77d1f67dde84585fc52f086347507e6b946dcf13e43 not found: ID does not exist" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.790995 4831 scope.go:117] "RemoveContainer" containerID="ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b" Dec 03 07:08:02 crc kubenswrapper[4831]: E1203 07:08:02.791551 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b\": container with ID starting with ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b not found: ID does not exist" containerID="ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.791583 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b"} err="failed to get container status \"ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b\": rpc error: code = NotFound desc = could not find container \"ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b\": container with ID starting with ebb0b35794a292a8c9efea12feb3eeacb5ebaefa35f88e670aface0c033d3d1b not found: ID does not 
exist" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.796065 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.796139 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djjr6\" (UniqueName: \"kubernetes.io/projected/08595bb8-301f-446f-ac7a-be72b8f154e0-kube-api-access-djjr6\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.830356 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08595bb8-301f-446f-ac7a-be72b8f154e0" (UID: "08595bb8-301f-446f-ac7a-be72b8f154e0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:08:02 crc kubenswrapper[4831]: I1203 07:08:02.897240 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08595bb8-301f-446f-ac7a-be72b8f154e0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:08:03 crc kubenswrapper[4831]: I1203 07:08:03.038483 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-49sh2"] Dec 03 07:08:03 crc kubenswrapper[4831]: I1203 07:08:03.041453 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-49sh2"] Dec 03 07:08:05 crc kubenswrapper[4831]: I1203 07:08:05.032130 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" path="/var/lib/kubelet/pods/08595bb8-301f-446f-ac7a-be72b8f154e0/volumes" Dec 03 07:08:06 crc kubenswrapper[4831]: E1203 07:08:06.430389 4831 cadvisor_stats_provider.go:516] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08595bb8_301f_446f_ac7a_be72b8f154e0.slice\": RecentStats: unable to find data in memory cache]" Dec 03 07:08:16 crc kubenswrapper[4831]: E1203 07:08:16.645500 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08595bb8_301f_446f_ac7a_be72b8f154e0.slice\": RecentStats: unable to find data in memory cache]" Dec 03 07:08:26 crc kubenswrapper[4831]: E1203 07:08:26.831166 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08595bb8_301f_446f_ac7a_be72b8f154e0.slice\": RecentStats: unable to find data in memory cache]" Dec 03 07:08:27 crc kubenswrapper[4831]: I1203 07:08:27.597259 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:08:27 crc kubenswrapper[4831]: I1203 07:08:27.597411 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:08:27 crc kubenswrapper[4831]: I1203 07:08:27.597497 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 07:08:27 crc kubenswrapper[4831]: I1203 07:08:27.598600 4831 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:08:27 crc kubenswrapper[4831]: I1203 07:08:27.598717 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" gracePeriod=600 Dec 03 07:08:27 crc kubenswrapper[4831]: E1203 07:08:27.739744 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:08:27 crc kubenswrapper[4831]: I1203 07:08:27.942806 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" exitCode=0 Dec 03 07:08:27 crc kubenswrapper[4831]: I1203 07:08:27.942890 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7"} Dec 03 07:08:27 crc kubenswrapper[4831]: I1203 07:08:27.942979 4831 scope.go:117] "RemoveContainer" containerID="d44b88821d6284bb85c274d1d4055519af09d0da6d45790a48b6162617c80d83" Dec 03 07:08:27 crc 
kubenswrapper[4831]: I1203 07:08:27.943800 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:08:27 crc kubenswrapper[4831]: E1203 07:08:27.944165 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:08:37 crc kubenswrapper[4831]: E1203 07:08:37.035123 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08595bb8_301f_446f_ac7a_be72b8f154e0.slice\": RecentStats: unable to find data in memory cache]" Dec 03 07:08:40 crc kubenswrapper[4831]: I1203 07:08:40.013225 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:08:40 crc kubenswrapper[4831]: E1203 07:08:40.014059 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:08:47 crc kubenswrapper[4831]: E1203 07:08:47.225942 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08595bb8_301f_446f_ac7a_be72b8f154e0.slice\": RecentStats: unable to find data in memory cache]" 
Dec 03 07:08:52 crc kubenswrapper[4831]: I1203 07:08:52.013152 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:08:52 crc kubenswrapper[4831]: E1203 07:08:52.014555 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:08:57 crc kubenswrapper[4831]: E1203 07:08:57.409887 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod08595bb8_301f_446f_ac7a_be72b8f154e0.slice\": RecentStats: unable to find data in memory cache]" Dec 03 07:09:06 crc kubenswrapper[4831]: I1203 07:09:06.013199 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:09:06 crc kubenswrapper[4831]: E1203 07:09:06.014196 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:09:21 crc kubenswrapper[4831]: I1203 07:09:21.012879 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:09:21 crc kubenswrapper[4831]: E1203 07:09:21.013980 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:09:32 crc kubenswrapper[4831]: I1203 07:09:32.012631 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:09:32 crc kubenswrapper[4831]: E1203 07:09:32.013888 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:09:43 crc kubenswrapper[4831]: I1203 07:09:43.031938 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:09:43 crc kubenswrapper[4831]: E1203 07:09:43.032968 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:09:58 crc kubenswrapper[4831]: I1203 07:09:58.013074 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:09:58 crc kubenswrapper[4831]: E1203 07:09:58.014531 4831 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:10:10 crc kubenswrapper[4831]: I1203 07:10:10.013020 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:10:10 crc kubenswrapper[4831]: E1203 07:10:10.013854 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:10:23 crc kubenswrapper[4831]: I1203 07:10:23.018519 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:10:23 crc kubenswrapper[4831]: E1203 07:10:23.019457 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:10:37 crc kubenswrapper[4831]: I1203 07:10:37.013620 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:10:37 crc kubenswrapper[4831]: E1203 07:10:37.015111 4831 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:10:51 crc kubenswrapper[4831]: I1203 07:10:51.013365 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:10:51 crc kubenswrapper[4831]: E1203 07:10:51.014186 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:11:02 crc kubenswrapper[4831]: I1203 07:11:02.014245 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:11:02 crc kubenswrapper[4831]: E1203 07:11:02.015546 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:11:14 crc kubenswrapper[4831]: I1203 07:11:14.012609 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:11:14 crc kubenswrapper[4831]: E1203 07:11:14.013599 4831 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:11:28 crc kubenswrapper[4831]: I1203 07:11:28.012801 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:11:28 crc kubenswrapper[4831]: E1203 07:11:28.014133 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:11:40 crc kubenswrapper[4831]: I1203 07:11:40.013296 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:11:40 crc kubenswrapper[4831]: E1203 07:11:40.016522 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:11:53 crc kubenswrapper[4831]: I1203 07:11:53.023060 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:11:53 crc kubenswrapper[4831]: E1203 
07:11:53.024138 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:12:04 crc kubenswrapper[4831]: I1203 07:12:04.014260 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:12:04 crc kubenswrapper[4831]: E1203 07:12:04.015498 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:12:16 crc kubenswrapper[4831]: I1203 07:12:16.013133 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:12:16 crc kubenswrapper[4831]: E1203 07:12:16.014385 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:12:29 crc kubenswrapper[4831]: I1203 07:12:29.012823 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:12:29 crc 
kubenswrapper[4831]: E1203 07:12:29.013754 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:12:42 crc kubenswrapper[4831]: I1203 07:12:42.013412 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:12:42 crc kubenswrapper[4831]: E1203 07:12:42.014625 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:12:55 crc kubenswrapper[4831]: I1203 07:12:55.013765 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:12:55 crc kubenswrapper[4831]: E1203 07:12:55.014580 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:13:10 crc kubenswrapper[4831]: I1203 07:13:10.013538 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 
03 07:13:10 crc kubenswrapper[4831]: E1203 07:13:10.014767 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:13:23 crc kubenswrapper[4831]: I1203 07:13:23.024839 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:13:23 crc kubenswrapper[4831]: E1203 07:13:23.026014 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:13:37 crc kubenswrapper[4831]: I1203 07:13:37.012454 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7" Dec 03 07:13:37 crc kubenswrapper[4831]: I1203 07:13:37.878740 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"4336866e55730b7395f5002fc32d90b28c6eace59d8f31857ec17526f866a09a"} Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.628051 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pp2kl"] Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.629690 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.629720 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.629758 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" containerName="extract-utilities" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.629770 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" containerName="extract-utilities" Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.629785 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerName="extract-utilities" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.629795 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerName="extract-utilities" Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.629809 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerName="extract-utilities" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.629821 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerName="extract-utilities" Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.629840 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" containerName="extract-content" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.629868 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" containerName="extract-content" Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.629891 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerName="extract-content" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.629902 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerName="extract-content" Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.629923 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerName="extract-content" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.629934 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerName="extract-content" Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.629952 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.629960 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: E1203 07:14:41.630141 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.630150 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.630403 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="48f08298-7a03-4a22-b570-1f498f6cb980" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.630435 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb5a31d8-0fe5-4bba-9b31-85056cb3c36e" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.630461 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="08595bb8-301f-446f-ac7a-be72b8f154e0" containerName="registry-server" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.632132 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.645609 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pp2kl"] Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.774307 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-utilities\") pod \"redhat-marketplace-pp2kl\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.774479 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-catalog-content\") pod \"redhat-marketplace-pp2kl\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.774512 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm2lp\" (UniqueName: \"kubernetes.io/projected/a24ce4ce-3ff0-420e-be47-9ff0d384af49-kube-api-access-pm2lp\") pod \"redhat-marketplace-pp2kl\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.875738 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-catalog-content\") pod \"redhat-marketplace-pp2kl\" (UID: 
\"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.875781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm2lp\" (UniqueName: \"kubernetes.io/projected/a24ce4ce-3ff0-420e-be47-9ff0d384af49-kube-api-access-pm2lp\") pod \"redhat-marketplace-pp2kl\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.875826 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-utilities\") pod \"redhat-marketplace-pp2kl\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.876511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-catalog-content\") pod \"redhat-marketplace-pp2kl\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.876552 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-utilities\") pod \"redhat-marketplace-pp2kl\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.897391 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm2lp\" (UniqueName: \"kubernetes.io/projected/a24ce4ce-3ff0-420e-be47-9ff0d384af49-kube-api-access-pm2lp\") pod \"redhat-marketplace-pp2kl\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " 
pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:41 crc kubenswrapper[4831]: I1203 07:14:41.977004 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:42 crc kubenswrapper[4831]: I1203 07:14:42.213399 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pp2kl"] Dec 03 07:14:42 crc kubenswrapper[4831]: I1203 07:14:42.528621 4831 generic.go:334] "Generic (PLEG): container finished" podID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerID="7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af" exitCode=0 Dec 03 07:14:42 crc kubenswrapper[4831]: I1203 07:14:42.528740 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pp2kl" event={"ID":"a24ce4ce-3ff0-420e-be47-9ff0d384af49","Type":"ContainerDied","Data":"7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af"} Dec 03 07:14:42 crc kubenswrapper[4831]: I1203 07:14:42.529027 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pp2kl" event={"ID":"a24ce4ce-3ff0-420e-be47-9ff0d384af49","Type":"ContainerStarted","Data":"7697a335f4e63787474f406373c667381d8ffa512744e3c28fd55a9ddcca7e8d"} Dec 03 07:14:42 crc kubenswrapper[4831]: I1203 07:14:42.530723 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:14:44 crc kubenswrapper[4831]: I1203 07:14:44.549844 4831 generic.go:334] "Generic (PLEG): container finished" podID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerID="643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c" exitCode=0 Dec 03 07:14:44 crc kubenswrapper[4831]: I1203 07:14:44.549965 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pp2kl" 
event={"ID":"a24ce4ce-3ff0-420e-be47-9ff0d384af49","Type":"ContainerDied","Data":"643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c"} Dec 03 07:14:45 crc kubenswrapper[4831]: I1203 07:14:45.563122 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pp2kl" event={"ID":"a24ce4ce-3ff0-420e-be47-9ff0d384af49","Type":"ContainerStarted","Data":"3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f"} Dec 03 07:14:45 crc kubenswrapper[4831]: I1203 07:14:45.580784 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pp2kl" podStartSLOduration=2.048819519 podStartE2EDuration="4.580767203s" podCreationTimestamp="2025-12-03 07:14:41 +0000 UTC" firstStartedPulling="2025-12-03 07:14:42.530308008 +0000 UTC m=+2619.873891526" lastFinishedPulling="2025-12-03 07:14:45.062255702 +0000 UTC m=+2622.405839210" observedRunningTime="2025-12-03 07:14:45.57943204 +0000 UTC m=+2622.923015548" watchObservedRunningTime="2025-12-03 07:14:45.580767203 +0000 UTC m=+2622.924350711" Dec 03 07:14:51 crc kubenswrapper[4831]: I1203 07:14:51.977420 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:51 crc kubenswrapper[4831]: I1203 07:14:51.977935 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:52 crc kubenswrapper[4831]: I1203 07:14:52.029523 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:52 crc kubenswrapper[4831]: I1203 07:14:52.714775 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:52 crc kubenswrapper[4831]: I1203 07:14:52.790715 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-pp2kl"] Dec 03 07:14:54 crc kubenswrapper[4831]: I1203 07:14:54.646713 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pp2kl" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerName="registry-server" containerID="cri-o://3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f" gracePeriod=2 Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.570097 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.588476 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pm2lp\" (UniqueName: \"kubernetes.io/projected/a24ce4ce-3ff0-420e-be47-9ff0d384af49-kube-api-access-pm2lp\") pod \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.588567 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-catalog-content\") pod \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.588600 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-utilities\") pod \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\" (UID: \"a24ce4ce-3ff0-420e-be47-9ff0d384af49\") " Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.591903 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-utilities" (OuterVolumeSpecName: "utilities") pod "a24ce4ce-3ff0-420e-be47-9ff0d384af49" (UID: 
"a24ce4ce-3ff0-420e-be47-9ff0d384af49"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.599553 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a24ce4ce-3ff0-420e-be47-9ff0d384af49-kube-api-access-pm2lp" (OuterVolumeSpecName: "kube-api-access-pm2lp") pod "a24ce4ce-3ff0-420e-be47-9ff0d384af49" (UID: "a24ce4ce-3ff0-420e-be47-9ff0d384af49"). InnerVolumeSpecName "kube-api-access-pm2lp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.619029 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a24ce4ce-3ff0-420e-be47-9ff0d384af49" (UID: "a24ce4ce-3ff0-420e-be47-9ff0d384af49"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.656742 4831 generic.go:334] "Generic (PLEG): container finished" podID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerID="3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f" exitCode=0 Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.656797 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pp2kl" event={"ID":"a24ce4ce-3ff0-420e-be47-9ff0d384af49","Type":"ContainerDied","Data":"3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f"} Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.656810 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pp2kl" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.656830 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pp2kl" event={"ID":"a24ce4ce-3ff0-420e-be47-9ff0d384af49","Type":"ContainerDied","Data":"7697a335f4e63787474f406373c667381d8ffa512744e3c28fd55a9ddcca7e8d"} Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.656851 4831 scope.go:117] "RemoveContainer" containerID="3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.686219 4831 scope.go:117] "RemoveContainer" containerID="643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.691930 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pm2lp\" (UniqueName: \"kubernetes.io/projected/a24ce4ce-3ff0-420e-be47-9ff0d384af49-kube-api-access-pm2lp\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.691971 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.691984 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a24ce4ce-3ff0-420e-be47-9ff0d384af49-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.696383 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pp2kl"] Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.701985 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pp2kl"] Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.714682 4831 scope.go:117] 
"RemoveContainer" containerID="7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.744834 4831 scope.go:117] "RemoveContainer" containerID="3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f" Dec 03 07:14:55 crc kubenswrapper[4831]: E1203 07:14:55.745397 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f\": container with ID starting with 3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f not found: ID does not exist" containerID="3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.745453 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f"} err="failed to get container status \"3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f\": rpc error: code = NotFound desc = could not find container \"3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f\": container with ID starting with 3562807675f215b6503169096e0889fffc5925a0836d34992630770fd558cb8f not found: ID does not exist" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.745475 4831 scope.go:117] "RemoveContainer" containerID="643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c" Dec 03 07:14:55 crc kubenswrapper[4831]: E1203 07:14:55.745947 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c\": container with ID starting with 643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c not found: ID does not exist" containerID="643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c" Dec 03 07:14:55 crc 
kubenswrapper[4831]: I1203 07:14:55.745984 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c"} err="failed to get container status \"643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c\": rpc error: code = NotFound desc = could not find container \"643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c\": container with ID starting with 643779965e88ed3fed729e20edf336b30a19af4a2057dab06d9d73e672bf4f2c not found: ID does not exist" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.745997 4831 scope.go:117] "RemoveContainer" containerID="7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af" Dec 03 07:14:55 crc kubenswrapper[4831]: E1203 07:14:55.746170 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af\": container with ID starting with 7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af not found: ID does not exist" containerID="7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af" Dec 03 07:14:55 crc kubenswrapper[4831]: I1203 07:14:55.746200 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af"} err="failed to get container status \"7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af\": rpc error: code = NotFound desc = could not find container \"7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af\": container with ID starting with 7f33dbdf3b5d469f40da75a6e17eb7dd67621c7baa6d17d66554a5a72b22e5af not found: ID does not exist" Dec 03 07:14:57 crc kubenswrapper[4831]: I1203 07:14:57.031687 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" 
path="/var/lib/kubelet/pods/a24ce4ce-3ff0-420e-be47-9ff0d384af49/volumes" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.164109 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"] Dec 03 07:15:00 crc kubenswrapper[4831]: E1203 07:15:00.164658 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerName="registry-server" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.164683 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerName="registry-server" Dec 03 07:15:00 crc kubenswrapper[4831]: E1203 07:15:00.164708 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerName="extract-utilities" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.164721 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerName="extract-utilities" Dec 03 07:15:00 crc kubenswrapper[4831]: E1203 07:15:00.164756 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerName="extract-content" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.164768 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerName="extract-content" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.165026 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a24ce4ce-3ff0-420e-be47-9ff0d384af49" containerName="registry-server" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.165759 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.168882 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.169537 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.188940 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"] Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.267082 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-config-volume\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.267368 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-secret-volume\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65" Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.267502 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s89px\" (UniqueName: \"kubernetes.io/projected/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-kube-api-access-s89px\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.368580 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-config-volume\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.368668 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-secret-volume\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.368701 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s89px\" (UniqueName: \"kubernetes.io/projected/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-kube-api-access-s89px\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.370004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-config-volume\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.383100 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-secret-volume\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.402217 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s89px\" (UniqueName: \"kubernetes.io/projected/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-kube-api-access-s89px\") pod \"collect-profiles-29412435-jch65\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:00 crc kubenswrapper[4831]: I1203 07:15:00.539901 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:01 crc kubenswrapper[4831]: I1203 07:15:01.023210 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"]
Dec 03 07:15:01 crc kubenswrapper[4831]: W1203 07:15:01.025016 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06133fcc_8cb2_4cd0_8d86_d18e7fbb838c.slice/crio-eb29b36b986a37bff76e1bef9f5c7e070779ce151c14f6e10b463953821d8c3f WatchSource:0}: Error finding container eb29b36b986a37bff76e1bef9f5c7e070779ce151c14f6e10b463953821d8c3f: Status 404 returned error can't find the container with id eb29b36b986a37bff76e1bef9f5c7e070779ce151c14f6e10b463953821d8c3f
Dec 03 07:15:01 crc kubenswrapper[4831]: I1203 07:15:01.716972 4831 generic.go:334] "Generic (PLEG): container finished" podID="06133fcc-8cb2-4cd0-8d86-d18e7fbb838c" containerID="b2a439d81b7701e329ea12c32447d28753fa831e721e77cdd70233207db7e14a" exitCode=0
Dec 03 07:15:01 crc kubenswrapper[4831]: I1203 07:15:01.717479 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65" event={"ID":"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c","Type":"ContainerDied","Data":"b2a439d81b7701e329ea12c32447d28753fa831e721e77cdd70233207db7e14a"}
Dec 03 07:15:01 crc kubenswrapper[4831]: I1203 07:15:01.717535 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65" event={"ID":"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c","Type":"ContainerStarted","Data":"eb29b36b986a37bff76e1bef9f5c7e070779ce151c14f6e10b463953821d8c3f"}
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.049671 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.205990 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s89px\" (UniqueName: \"kubernetes.io/projected/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-kube-api-access-s89px\") pod \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") "
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.206123 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-config-volume\") pod \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") "
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.206228 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-secret-volume\") pod \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\" (UID: \"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c\") "
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.207410 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-config-volume" (OuterVolumeSpecName: "config-volume") pod "06133fcc-8cb2-4cd0-8d86-d18e7fbb838c" (UID: "06133fcc-8cb2-4cd0-8d86-d18e7fbb838c"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.214376 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06133fcc-8cb2-4cd0-8d86-d18e7fbb838c" (UID: "06133fcc-8cb2-4cd0-8d86-d18e7fbb838c"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.214493 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-kube-api-access-s89px" (OuterVolumeSpecName: "kube-api-access-s89px") pod "06133fcc-8cb2-4cd0-8d86-d18e7fbb838c" (UID: "06133fcc-8cb2-4cd0-8d86-d18e7fbb838c"). InnerVolumeSpecName "kube-api-access-s89px". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.308606 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s89px\" (UniqueName: \"kubernetes.io/projected/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-kube-api-access-s89px\") on node \"crc\" DevicePath \"\""
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.308668 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.308693 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.738020 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65" event={"ID":"06133fcc-8cb2-4cd0-8d86-d18e7fbb838c","Type":"ContainerDied","Data":"eb29b36b986a37bff76e1bef9f5c7e070779ce151c14f6e10b463953821d8c3f"}
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.738089 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"
Dec 03 07:15:03 crc kubenswrapper[4831]: I1203 07:15:03.738080 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb29b36b986a37bff76e1bef9f5c7e070779ce151c14f6e10b463953821d8c3f"
Dec 03 07:15:04 crc kubenswrapper[4831]: I1203 07:15:04.154896 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w"]
Dec 03 07:15:04 crc kubenswrapper[4831]: I1203 07:15:04.174571 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412390-2zb7w"]
Dec 03 07:15:05 crc kubenswrapper[4831]: I1203 07:15:05.024771 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e528cdde-c61e-42dd-8c55-e5276df017c6" path="/var/lib/kubelet/pods/e528cdde-c61e-42dd-8c55-e5276df017c6/volumes"
Dec 03 07:15:17 crc kubenswrapper[4831]: I1203 07:15:17.108520 4831 scope.go:117] "RemoveContainer" containerID="a7f810c2729b06550a79448fe19586d7a79f83aa112785034d92eece47e37f1f"
Dec 03 07:15:57 crc kubenswrapper[4831]: I1203 07:15:57.597406 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 07:15:57 crc kubenswrapper[4831]: I1203 07:15:57.598140 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 07:16:27 crc kubenswrapper[4831]: I1203 07:16:27.596906 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 07:16:27 crc kubenswrapper[4831]: I1203 07:16:27.597820 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 07:16:57 crc kubenswrapper[4831]: I1203 07:16:57.596936 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 07:16:57 crc kubenswrapper[4831]: I1203 07:16:57.597670 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 07:16:57 crc kubenswrapper[4831]: I1203 07:16:57.597925 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5"
Dec 03 07:16:57 crc kubenswrapper[4831]: I1203 07:16:57.598957 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4336866e55730b7395f5002fc32d90b28c6eace59d8f31857ec17526f866a09a"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 07:16:57 crc kubenswrapper[4831]: I1203 07:16:57.599066 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://4336866e55730b7395f5002fc32d90b28c6eace59d8f31857ec17526f866a09a" gracePeriod=600
Dec 03 07:16:57 crc kubenswrapper[4831]: I1203 07:16:57.869211 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="4336866e55730b7395f5002fc32d90b28c6eace59d8f31857ec17526f866a09a" exitCode=0
Dec 03 07:16:57 crc kubenswrapper[4831]: I1203 07:16:57.869279 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"4336866e55730b7395f5002fc32d90b28c6eace59d8f31857ec17526f866a09a"}
Dec 03 07:16:57 crc kubenswrapper[4831]: I1203 07:16:57.869475 4831 scope.go:117] "RemoveContainer" containerID="77a5558d91c2926eb34e4d040a01d7f30f2b05e3221ec7402f123ab4aa1b7fa7"
Dec 03 07:16:58 crc kubenswrapper[4831]: I1203 07:16:58.882113 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"}
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.172454 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-hn9kq"]
Dec 03 07:17:50 crc kubenswrapper[4831]: E1203 07:17:50.173907 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06133fcc-8cb2-4cd0-8d86-d18e7fbb838c" containerName="collect-profiles"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.173943 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="06133fcc-8cb2-4cd0-8d86-d18e7fbb838c" containerName="collect-profiles"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.174411 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="06133fcc-8cb2-4cd0-8d86-d18e7fbb838c" containerName="collect-profiles"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.176923 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.184435 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hn9kq"]
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.276216 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbxpk\" (UniqueName: \"kubernetes.io/projected/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-kube-api-access-zbxpk\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.276293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-utilities\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.276380 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-catalog-content\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.376804 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-catalog-content\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.376868 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbxpk\" (UniqueName: \"kubernetes.io/projected/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-kube-api-access-zbxpk\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.376910 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-utilities\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.377939 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-catalog-content\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.378136 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-utilities\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.407041 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbxpk\" (UniqueName: \"kubernetes.io/projected/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-kube-api-access-zbxpk\") pod \"community-operators-hn9kq\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") " pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.527653 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:17:50 crc kubenswrapper[4831]: I1203 07:17:50.889808 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-hn9kq"]
Dec 03 07:17:51 crc kubenswrapper[4831]: I1203 07:17:51.354664 4831 generic.go:334] "Generic (PLEG): container finished" podID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerID="dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd" exitCode=0
Dec 03 07:17:51 crc kubenswrapper[4831]: I1203 07:17:51.354817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hn9kq" event={"ID":"45d817fb-f2d8-419b-b3cf-b0a59ca44c33","Type":"ContainerDied","Data":"dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd"}
Dec 03 07:17:51 crc kubenswrapper[4831]: I1203 07:17:51.355121 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hn9kq" event={"ID":"45d817fb-f2d8-419b-b3cf-b0a59ca44c33","Type":"ContainerStarted","Data":"435f92e38b3c258c489ec39a7cf5125214d0dc9d4e6e6b3d719e3da9eeb8d096"}
Dec 03 07:17:52 crc kubenswrapper[4831]: I1203 07:17:52.363614 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hn9kq" event={"ID":"45d817fb-f2d8-419b-b3cf-b0a59ca44c33","Type":"ContainerStarted","Data":"189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8"}
Dec 03 07:17:53 crc kubenswrapper[4831]: I1203 07:17:53.381974 4831 generic.go:334] "Generic (PLEG): container finished" podID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerID="189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8" exitCode=0
Dec 03 07:17:53 crc kubenswrapper[4831]: I1203 07:17:53.382062 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hn9kq" event={"ID":"45d817fb-f2d8-419b-b3cf-b0a59ca44c33","Type":"ContainerDied","Data":"189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8"}
Dec 03 07:17:54 crc kubenswrapper[4831]: I1203 07:17:54.393783 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hn9kq" event={"ID":"45d817fb-f2d8-419b-b3cf-b0a59ca44c33","Type":"ContainerStarted","Data":"8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9"}
Dec 03 07:17:54 crc kubenswrapper[4831]: I1203 07:17:54.432543 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-hn9kq" podStartSLOduration=1.961155808 podStartE2EDuration="4.432508293s" podCreationTimestamp="2025-12-03 07:17:50 +0000 UTC" firstStartedPulling="2025-12-03 07:17:51.35670758 +0000 UTC m=+2808.700291128" lastFinishedPulling="2025-12-03 07:17:53.828060085 +0000 UTC m=+2811.171643613" observedRunningTime="2025-12-03 07:17:54.421020123 +0000 UTC m=+2811.764603701" watchObservedRunningTime="2025-12-03 07:17:54.432508293 +0000 UTC m=+2811.776091811"
Dec 03 07:18:00 crc kubenswrapper[4831]: I1203 07:18:00.528149 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:18:00 crc kubenswrapper[4831]: I1203 07:18:00.528705 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:18:00 crc kubenswrapper[4831]: I1203 07:18:00.589896 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:18:01 crc kubenswrapper[4831]: I1203 07:18:01.566262 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:18:01 crc kubenswrapper[4831]: I1203 07:18:01.633140 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hn9kq"]
Dec 03 07:18:03 crc kubenswrapper[4831]: I1203 07:18:03.503100 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-hn9kq" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerName="registry-server" containerID="cri-o://8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9" gracePeriod=2
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.383427 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.502560 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-utilities\") pod \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") "
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.503005 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-catalog-content\") pod \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") "
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.504134 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-utilities" (OuterVolumeSpecName: "utilities") pod "45d817fb-f2d8-419b-b3cf-b0a59ca44c33" (UID: "45d817fb-f2d8-419b-b3cf-b0a59ca44c33"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.504552 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbxpk\" (UniqueName: \"kubernetes.io/projected/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-kube-api-access-zbxpk\") pod \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\" (UID: \"45d817fb-f2d8-419b-b3cf-b0a59ca44c33\") "
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.505466 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.513845 4831 generic.go:334] "Generic (PLEG): container finished" podID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerID="8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9" exitCode=0
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.513946 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hn9kq" event={"ID":"45d817fb-f2d8-419b-b3cf-b0a59ca44c33","Type":"ContainerDied","Data":"8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9"}
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.513980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-hn9kq" event={"ID":"45d817fb-f2d8-419b-b3cf-b0a59ca44c33","Type":"ContainerDied","Data":"435f92e38b3c258c489ec39a7cf5125214d0dc9d4e6e6b3d719e3da9eeb8d096"}
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.514008 4831 scope.go:117] "RemoveContainer" containerID="8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.514151 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-hn9kq"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.515586 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-kube-api-access-zbxpk" (OuterVolumeSpecName: "kube-api-access-zbxpk") pod "45d817fb-f2d8-419b-b3cf-b0a59ca44c33" (UID: "45d817fb-f2d8-419b-b3cf-b0a59ca44c33"). InnerVolumeSpecName "kube-api-access-zbxpk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.558978 4831 scope.go:117] "RemoveContainer" containerID="189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.580039 4831 scope.go:117] "RemoveContainer" containerID="dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.584749 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45d817fb-f2d8-419b-b3cf-b0a59ca44c33" (UID: "45d817fb-f2d8-419b-b3cf-b0a59ca44c33"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.606392 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.606421 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbxpk\" (UniqueName: \"kubernetes.io/projected/45d817fb-f2d8-419b-b3cf-b0a59ca44c33-kube-api-access-zbxpk\") on node \"crc\" DevicePath \"\""
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.607549 4831 scope.go:117] "RemoveContainer" containerID="8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9"
Dec 03 07:18:04 crc kubenswrapper[4831]: E1203 07:18:04.607965 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9\": container with ID starting with 8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9 not found: ID does not exist" containerID="8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.607998 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9"} err="failed to get container status \"8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9\": rpc error: code = NotFound desc = could not find container \"8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9\": container with ID starting with 8764941701cdbd2aaf457d261631d6f8b7546fc83402f584d512f9796c672bc9 not found: ID does not exist"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.608019 4831 scope.go:117] "RemoveContainer" containerID="189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8"
Dec 03 07:18:04 crc kubenswrapper[4831]: E1203 07:18:04.608331 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8\": container with ID starting with 189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8 not found: ID does not exist" containerID="189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.608351 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8"} err="failed to get container status \"189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8\": rpc error: code = NotFound desc = could not find container \"189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8\": container with ID starting with 189e1ca3339d3c875cd29db55bae1e6267a137231d04f9abddd29f8176709fb8 not found: ID does not exist"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.608363 4831 scope.go:117] "RemoveContainer" containerID="dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd"
Dec 03 07:18:04 crc kubenswrapper[4831]: E1203 07:18:04.608737 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd\": container with ID starting with dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd not found: ID does not exist" containerID="dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.608791 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd"} err="failed to get container status \"dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd\": rpc error: code = NotFound desc = could not find container \"dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd\": container with ID starting with dc62a7ef72357d552e1fd83f79891dcc2c7229ad259b4072cb13780cfdc6b9fd not found: ID does not exist"
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.860187 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-hn9kq"]
Dec 03 07:18:04 crc kubenswrapper[4831]: I1203 07:18:04.865371 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-hn9kq"]
Dec 03 07:18:05 crc kubenswrapper[4831]: I1203 07:18:05.032079 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" path="/var/lib/kubelet/pods/45d817fb-f2d8-419b-b3cf-b0a59ca44c33/volumes"
Dec 03 07:18:57 crc kubenswrapper[4831]: I1203 07:18:57.596639 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 07:18:57 crc kubenswrapper[4831]: I1203 07:18:57.598661 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 07:19:27 crc kubenswrapper[4831]: I1203 07:19:27.596928 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 07:19:27 crc kubenswrapper[4831]: I1203 07:19:27.597391 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 07:19:57 crc kubenswrapper[4831]: I1203 07:19:57.596774 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Dec 03 07:19:57 crc kubenswrapper[4831]: I1203 07:19:57.597431 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Dec 03 07:19:57 crc kubenswrapper[4831]: I1203 07:19:57.597519 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5"
Dec 03 07:19:57 crc kubenswrapper[4831]: I1203 07:19:57.598599 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Dec 03 07:19:57 crc kubenswrapper[4831]: I1203 07:19:57.598710 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" gracePeriod=600
Dec 03 07:19:57 crc kubenswrapper[4831]: E1203 07:19:57.733401 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 07:19:58 crc kubenswrapper[4831]: I1203 07:19:58.714249 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" exitCode=0
Dec 03 07:19:58 crc kubenswrapper[4831]: I1203 07:19:58.714396 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"}
Dec 03 07:19:58 crc kubenswrapper[4831]: I1203 07:19:58.714718 4831 scope.go:117] "RemoveContainer" containerID="4336866e55730b7395f5002fc32d90b28c6eace59d8f31857ec17526f866a09a"
Dec 03 07:19:58 crc kubenswrapper[4831]: I1203 07:19:58.716141 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"
Dec 03 07:19:58 crc kubenswrapper[4831]: E1203 07:19:58.716934 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 07:20:10 crc kubenswrapper[4831]: I1203 07:20:10.013964 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"
Dec 03 07:20:10 crc kubenswrapper[4831]: E1203 07:20:10.014987 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 07:20:23 crc kubenswrapper[4831]: I1203 07:20:23.021197 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"
Dec 03 07:20:23 crc kubenswrapper[4831]: E1203 07:20:23.022212 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 07:20:36 crc kubenswrapper[4831]: I1203 07:20:36.012665 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"
Dec 03 07:20:36 crc kubenswrapper[4831]: E1203 07:20:36.013812 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 07:20:48 crc kubenswrapper[4831]: I1203 07:20:48.013652 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"
Dec 03 07:20:48 crc kubenswrapper[4831]: E1203 07:20:48.014676 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 07:21:01 crc kubenswrapper[4831]: I1203 07:21:01.012490 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"
Dec 03 07:21:01 crc kubenswrapper[4831]: E1203 07:21:01.013150 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 07:21:14 crc kubenswrapper[4831]: I1203 07:21:14.014038 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9"
Dec 03 07:21:14 crc kubenswrapper[4831]: E1203 07:21:14.015051 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:21:28 crc kubenswrapper[4831]: I1203 07:21:28.012988 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:21:28 crc kubenswrapper[4831]: E1203 07:21:28.014180 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:21:39 crc kubenswrapper[4831]: I1203 07:21:39.013438 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:21:39 crc kubenswrapper[4831]: E1203 07:21:39.014420 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:21:52 crc kubenswrapper[4831]: I1203 07:21:52.013936 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:21:52 crc kubenswrapper[4831]: E1203 07:21:52.015047 4831 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:22:03 crc kubenswrapper[4831]: I1203 07:22:03.021652 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:22:03 crc kubenswrapper[4831]: E1203 07:22:03.022990 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:22:18 crc kubenswrapper[4831]: I1203 07:22:18.013105 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:22:18 crc kubenswrapper[4831]: E1203 07:22:18.014080 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:22:29 crc kubenswrapper[4831]: I1203 07:22:29.013899 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:22:29 crc kubenswrapper[4831]: E1203 07:22:29.015524 4831 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:22:42 crc kubenswrapper[4831]: I1203 07:22:42.012818 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:22:42 crc kubenswrapper[4831]: E1203 07:22:42.013486 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:22:53 crc kubenswrapper[4831]: I1203 07:22:53.021494 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:22:53 crc kubenswrapper[4831]: E1203 07:22:53.022552 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:23:04 crc kubenswrapper[4831]: I1203 07:23:04.012884 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:23:04 crc kubenswrapper[4831]: E1203 07:23:04.013769 4831 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.604975 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gjnm2"] Dec 03 07:23:09 crc kubenswrapper[4831]: E1203 07:23:09.606136 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerName="registry-server" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.606152 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerName="registry-server" Dec 03 07:23:09 crc kubenswrapper[4831]: E1203 07:23:09.606164 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerName="extract-content" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.606171 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerName="extract-content" Dec 03 07:23:09 crc kubenswrapper[4831]: E1203 07:23:09.606209 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerName="extract-utilities" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.606219 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerName="extract-utilities" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.606422 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d817fb-f2d8-419b-b3cf-b0a59ca44c33" containerName="registry-server" Dec 03 
07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.607593 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.621833 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gjnm2"] Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.800049 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sdpz\" (UniqueName: \"kubernetes.io/projected/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-kube-api-access-9sdpz\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.800101 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-utilities\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.800134 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-catalog-content\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.901243 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sdpz\" (UniqueName: \"kubernetes.io/projected/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-kube-api-access-9sdpz\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 
07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.901285 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-utilities\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.901306 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-catalog-content\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.902110 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-utilities\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.902200 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-catalog-content\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.932214 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sdpz\" (UniqueName: \"kubernetes.io/projected/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-kube-api-access-9sdpz\") pod \"redhat-operators-gjnm2\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:09 crc kubenswrapper[4831]: I1203 07:23:09.944821 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.393406 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hqncm"] Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.395599 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.401828 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqncm"] Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.409575 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-utilities\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.409623 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-catalog-content\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.409671 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mssn\" (UniqueName: \"kubernetes.io/projected/0fa157fd-cecb-4cb7-8472-a16c6bef497b-kube-api-access-4mssn\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.424278 4831 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-marketplace/redhat-operators-gjnm2"] Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.510551 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-catalog-content\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.510589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-utilities\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.510620 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mssn\" (UniqueName: \"kubernetes.io/projected/0fa157fd-cecb-4cb7-8472-a16c6bef497b-kube-api-access-4mssn\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.511175 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-catalog-content\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.511271 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-utilities\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " 
pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.530084 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mssn\" (UniqueName: \"kubernetes.io/projected/0fa157fd-cecb-4cb7-8472-a16c6bef497b-kube-api-access-4mssn\") pod \"certified-operators-hqncm\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.533025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnm2" event={"ID":"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020","Type":"ContainerStarted","Data":"1d12fa694ef549310bad78590fb11c228349f49234009cd8261ce68f1ab341da"} Dec 03 07:23:10 crc kubenswrapper[4831]: I1203 07:23:10.728621 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:11 crc kubenswrapper[4831]: I1203 07:23:11.023925 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hqncm"] Dec 03 07:23:11 crc kubenswrapper[4831]: I1203 07:23:11.548356 4831 generic.go:334] "Generic (PLEG): container finished" podID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerID="368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455" exitCode=0 Dec 03 07:23:11 crc kubenswrapper[4831]: I1203 07:23:11.548427 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqncm" event={"ID":"0fa157fd-cecb-4cb7-8472-a16c6bef497b","Type":"ContainerDied","Data":"368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455"} Dec 03 07:23:11 crc kubenswrapper[4831]: I1203 07:23:11.548787 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqncm" 
event={"ID":"0fa157fd-cecb-4cb7-8472-a16c6bef497b","Type":"ContainerStarted","Data":"809a118fec3577c3affc3e243cd6224ab95ea7d6d4abb61fb7d8ce367793548c"} Dec 03 07:23:11 crc kubenswrapper[4831]: I1203 07:23:11.552193 4831 generic.go:334] "Generic (PLEG): container finished" podID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerID="356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61" exitCode=0 Dec 03 07:23:11 crc kubenswrapper[4831]: I1203 07:23:11.552273 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnm2" event={"ID":"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020","Type":"ContainerDied","Data":"356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61"} Dec 03 07:23:11 crc kubenswrapper[4831]: I1203 07:23:11.552759 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:23:13 crc kubenswrapper[4831]: I1203 07:23:13.579586 4831 generic.go:334] "Generic (PLEG): container finished" podID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerID="3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85" exitCode=0 Dec 03 07:23:13 crc kubenswrapper[4831]: I1203 07:23:13.579729 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnm2" event={"ID":"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020","Type":"ContainerDied","Data":"3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85"} Dec 03 07:23:13 crc kubenswrapper[4831]: I1203 07:23:13.585352 4831 generic.go:334] "Generic (PLEG): container finished" podID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerID="85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021" exitCode=0 Dec 03 07:23:13 crc kubenswrapper[4831]: I1203 07:23:13.585431 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqncm" 
event={"ID":"0fa157fd-cecb-4cb7-8472-a16c6bef497b","Type":"ContainerDied","Data":"85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021"} Dec 03 07:23:14 crc kubenswrapper[4831]: I1203 07:23:14.599752 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnm2" event={"ID":"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020","Type":"ContainerStarted","Data":"8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4"} Dec 03 07:23:14 crc kubenswrapper[4831]: I1203 07:23:14.602483 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqncm" event={"ID":"0fa157fd-cecb-4cb7-8472-a16c6bef497b","Type":"ContainerStarted","Data":"c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2"} Dec 03 07:23:14 crc kubenswrapper[4831]: I1203 07:23:14.630536 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gjnm2" podStartSLOduration=3.132732788 podStartE2EDuration="5.630481251s" podCreationTimestamp="2025-12-03 07:23:09 +0000 UTC" firstStartedPulling="2025-12-03 07:23:11.554156901 +0000 UTC m=+3128.897740449" lastFinishedPulling="2025-12-03 07:23:14.051905374 +0000 UTC m=+3131.395488912" observedRunningTime="2025-12-03 07:23:14.625551937 +0000 UTC m=+3131.969135455" watchObservedRunningTime="2025-12-03 07:23:14.630481251 +0000 UTC m=+3131.974064799" Dec 03 07:23:14 crc kubenswrapper[4831]: I1203 07:23:14.649109 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hqncm" podStartSLOduration=2.157561546 podStartE2EDuration="4.649085994s" podCreationTimestamp="2025-12-03 07:23:10 +0000 UTC" firstStartedPulling="2025-12-03 07:23:11.552365705 +0000 UTC m=+3128.895949243" lastFinishedPulling="2025-12-03 07:23:14.043890163 +0000 UTC m=+3131.387473691" observedRunningTime="2025-12-03 07:23:14.645755449 +0000 UTC m=+3131.989338997" 
watchObservedRunningTime="2025-12-03 07:23:14.649085994 +0000 UTC m=+3131.992669502" Dec 03 07:23:15 crc kubenswrapper[4831]: I1203 07:23:15.012538 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:23:15 crc kubenswrapper[4831]: E1203 07:23:15.012771 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:23:19 crc kubenswrapper[4831]: I1203 07:23:19.945118 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:19 crc kubenswrapper[4831]: I1203 07:23:19.945758 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:20 crc kubenswrapper[4831]: I1203 07:23:20.729556 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:20 crc kubenswrapper[4831]: I1203 07:23:20.729637 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:20 crc kubenswrapper[4831]: I1203 07:23:20.803471 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:20 crc kubenswrapper[4831]: I1203 07:23:20.999290 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gjnm2" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="registry-server" probeResult="failure" output=< Dec 03 07:23:20 crc 
kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 07:23:20 crc kubenswrapper[4831]: > Dec 03 07:23:21 crc kubenswrapper[4831]: I1203 07:23:21.739819 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:21 crc kubenswrapper[4831]: I1203 07:23:21.815818 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqncm"] Dec 03 07:23:23 crc kubenswrapper[4831]: I1203 07:23:23.692533 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-hqncm" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerName="registry-server" containerID="cri-o://c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2" gracePeriod=2 Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.665167 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.703268 4831 generic.go:334] "Generic (PLEG): container finished" podID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerID="c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2" exitCode=0 Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.703366 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqncm" event={"ID":"0fa157fd-cecb-4cb7-8472-a16c6bef497b","Type":"ContainerDied","Data":"c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2"} Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.703415 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hqncm" event={"ID":"0fa157fd-cecb-4cb7-8472-a16c6bef497b","Type":"ContainerDied","Data":"809a118fec3577c3affc3e243cd6224ab95ea7d6d4abb61fb7d8ce367793548c"} Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 
07:23:24.703434 4831 scope.go:117] "RemoveContainer" containerID="c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.703361 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hqncm" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.729780 4831 scope.go:117] "RemoveContainer" containerID="85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.750728 4831 scope.go:117] "RemoveContainer" containerID="368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.767174 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mssn\" (UniqueName: \"kubernetes.io/projected/0fa157fd-cecb-4cb7-8472-a16c6bef497b-kube-api-access-4mssn\") pod \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.767611 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-catalog-content\") pod \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.767692 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-utilities\") pod \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\" (UID: \"0fa157fd-cecb-4cb7-8472-a16c6bef497b\") " Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.768972 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-utilities" (OuterVolumeSpecName: 
"utilities") pod "0fa157fd-cecb-4cb7-8472-a16c6bef497b" (UID: "0fa157fd-cecb-4cb7-8472-a16c6bef497b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.774710 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fa157fd-cecb-4cb7-8472-a16c6bef497b-kube-api-access-4mssn" (OuterVolumeSpecName: "kube-api-access-4mssn") pod "0fa157fd-cecb-4cb7-8472-a16c6bef497b" (UID: "0fa157fd-cecb-4cb7-8472-a16c6bef497b"). InnerVolumeSpecName "kube-api-access-4mssn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.779197 4831 scope.go:117] "RemoveContainer" containerID="c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2" Dec 03 07:23:24 crc kubenswrapper[4831]: E1203 07:23:24.779731 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2\": container with ID starting with c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2 not found: ID does not exist" containerID="c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.779785 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2"} err="failed to get container status \"c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2\": rpc error: code = NotFound desc = could not find container \"c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2\": container with ID starting with c6f129c594c2325b401bb9728052aabd702893e823fb3e0ee608b112e07863c2 not found: ID does not exist" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.779822 4831 scope.go:117] "RemoveContainer" 
containerID="85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021" Dec 03 07:23:24 crc kubenswrapper[4831]: E1203 07:23:24.780171 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021\": container with ID starting with 85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021 not found: ID does not exist" containerID="85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.780205 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021"} err="failed to get container status \"85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021\": rpc error: code = NotFound desc = could not find container \"85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021\": container with ID starting with 85f7f0cb3573347ea7f12aad2986cdae01a19fc4df6a8c902b960b85a1f0e021 not found: ID does not exist" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.780223 4831 scope.go:117] "RemoveContainer" containerID="368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455" Dec 03 07:23:24 crc kubenswrapper[4831]: E1203 07:23:24.780440 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455\": container with ID starting with 368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455 not found: ID does not exist" containerID="368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.780463 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455"} err="failed to get container status \"368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455\": rpc error: code = NotFound desc = could not find container \"368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455\": container with ID starting with 368470d0df44b372a3159b5acef6c86a2bdfdfabe93c211f487a06c4e887e455 not found: ID does not exist" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.816642 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0fa157fd-cecb-4cb7-8472-a16c6bef497b" (UID: "0fa157fd-cecb-4cb7-8472-a16c6bef497b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.868998 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mssn\" (UniqueName: \"kubernetes.io/projected/0fa157fd-cecb-4cb7-8472-a16c6bef497b-kube-api-access-4mssn\") on node \"crc\" DevicePath \"\"" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.869023 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:23:24 crc kubenswrapper[4831]: I1203 07:23:24.869034 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0fa157fd-cecb-4cb7-8472-a16c6bef497b-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:23:25 crc kubenswrapper[4831]: I1203 07:23:25.052896 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-hqncm"] Dec 03 07:23:25 crc kubenswrapper[4831]: I1203 07:23:25.059421 4831 kubelet.go:2431] "SyncLoop REMOVE" 
source="api" pods=["openshift-marketplace/certified-operators-hqncm"] Dec 03 07:23:26 crc kubenswrapper[4831]: I1203 07:23:26.013408 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:23:26 crc kubenswrapper[4831]: E1203 07:23:26.013879 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:23:27 crc kubenswrapper[4831]: I1203 07:23:27.025585 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" path="/var/lib/kubelet/pods/0fa157fd-cecb-4cb7-8472-a16c6bef497b/volumes" Dec 03 07:23:30 crc kubenswrapper[4831]: I1203 07:23:30.025723 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:30 crc kubenswrapper[4831]: I1203 07:23:30.085449 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:30 crc kubenswrapper[4831]: I1203 07:23:30.289424 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjnm2"] Dec 03 07:23:31 crc kubenswrapper[4831]: I1203 07:23:31.784259 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gjnm2" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="registry-server" containerID="cri-o://8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4" gracePeriod=2 Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.237826 4831 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.390711 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9sdpz\" (UniqueName: \"kubernetes.io/projected/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-kube-api-access-9sdpz\") pod \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.390799 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-utilities\") pod \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.390839 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-catalog-content\") pod \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\" (UID: \"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020\") " Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.392431 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-utilities" (OuterVolumeSpecName: "utilities") pod "6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" (UID: "6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.399571 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-kube-api-access-9sdpz" (OuterVolumeSpecName: "kube-api-access-9sdpz") pod "6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" (UID: "6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020"). 
InnerVolumeSpecName "kube-api-access-9sdpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.492872 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9sdpz\" (UniqueName: \"kubernetes.io/projected/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-kube-api-access-9sdpz\") on node \"crc\" DevicePath \"\"" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.492921 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.557708 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" (UID: "6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.594885 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.798183 4831 generic.go:334] "Generic (PLEG): container finished" podID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerID="8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4" exitCode=0 Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.798422 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnm2" event={"ID":"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020","Type":"ContainerDied","Data":"8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4"} Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.798704 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gjnm2" event={"ID":"6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020","Type":"ContainerDied","Data":"1d12fa694ef549310bad78590fb11c228349f49234009cd8261ce68f1ab341da"} Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.798778 4831 scope.go:117] "RemoveContainer" containerID="8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.798575 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gjnm2" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.829877 4831 scope.go:117] "RemoveContainer" containerID="3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.865518 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gjnm2"] Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.869664 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gjnm2"] Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.872740 4831 scope.go:117] "RemoveContainer" containerID="356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.919913 4831 scope.go:117] "RemoveContainer" containerID="8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4" Dec 03 07:23:32 crc kubenswrapper[4831]: E1203 07:23:32.920872 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4\": container with ID starting with 8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4 not found: ID does not exist" containerID="8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.920930 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4"} err="failed to get container status \"8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4\": rpc error: code = NotFound desc = could not find container \"8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4\": container with ID starting with 8121dbf5be9fa117210c1aa0750f7035841f021c2c2b605bcf6575db08aa3ba4 not found: ID does 
not exist" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.920965 4831 scope.go:117] "RemoveContainer" containerID="3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85" Dec 03 07:23:32 crc kubenswrapper[4831]: E1203 07:23:32.921673 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85\": container with ID starting with 3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85 not found: ID does not exist" containerID="3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.921748 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85"} err="failed to get container status \"3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85\": rpc error: code = NotFound desc = could not find container \"3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85\": container with ID starting with 3e77fa64f571fa643c564313e27649b54560fb57ffe3ab88bb475381bce93d85 not found: ID does not exist" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.921798 4831 scope.go:117] "RemoveContainer" containerID="356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61" Dec 03 07:23:32 crc kubenswrapper[4831]: E1203 07:23:32.922422 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61\": container with ID starting with 356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61 not found: ID does not exist" containerID="356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61" Dec 03 07:23:32 crc kubenswrapper[4831]: I1203 07:23:32.922471 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61"} err="failed to get container status \"356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61\": rpc error: code = NotFound desc = could not find container \"356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61\": container with ID starting with 356e9195f8a98a34e1f7a3a200f9a8e4e487153e6ae5199d3d7378ac3dec3a61 not found: ID does not exist" Dec 03 07:23:33 crc kubenswrapper[4831]: I1203 07:23:33.032495 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" path="/var/lib/kubelet/pods/6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020/volumes" Dec 03 07:23:39 crc kubenswrapper[4831]: I1203 07:23:39.013709 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:23:39 crc kubenswrapper[4831]: E1203 07:23:39.014746 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:23:54 crc kubenswrapper[4831]: I1203 07:23:54.012637 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:23:54 crc kubenswrapper[4831]: E1203 07:23:54.013724 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:24:05 crc kubenswrapper[4831]: I1203 07:24:05.012304 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:24:05 crc kubenswrapper[4831]: E1203 07:24:05.013223 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:24:17 crc kubenswrapper[4831]: I1203 07:24:17.013364 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:24:17 crc kubenswrapper[4831]: E1203 07:24:17.014468 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:24:31 crc kubenswrapper[4831]: I1203 07:24:31.014244 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:24:31 crc kubenswrapper[4831]: E1203 07:24:31.015146 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:24:46 crc kubenswrapper[4831]: I1203 07:24:46.013516 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:24:46 crc kubenswrapper[4831]: E1203 07:24:46.014634 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.441921 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-c49gt"] Dec 03 07:24:58 crc kubenswrapper[4831]: E1203 07:24:58.443018 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerName="extract-utilities" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.443042 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerName="extract-utilities" Dec 03 07:24:58 crc kubenswrapper[4831]: E1203 07:24:58.443061 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerName="registry-server" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.443072 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerName="registry-server" Dec 03 07:24:58 crc kubenswrapper[4831]: E1203 07:24:58.443098 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="registry-server" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.443109 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="registry-server" Dec 03 07:24:58 crc kubenswrapper[4831]: E1203 07:24:58.443122 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerName="extract-content" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.443135 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerName="extract-content" Dec 03 07:24:58 crc kubenswrapper[4831]: E1203 07:24:58.443151 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="extract-utilities" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.443162 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="extract-utilities" Dec 03 07:24:58 crc kubenswrapper[4831]: E1203 07:24:58.443184 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="extract-content" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.443194 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="extract-content" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.443445 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a9a9d9d-2c09-4b6c-8ca3-5b4e66902020" containerName="registry-server" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.443482 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fa157fd-cecb-4cb7-8472-a16c6bef497b" containerName="registry-server" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.449972 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.465075 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpwpz\" (UniqueName: \"kubernetes.io/projected/ec990c6a-30af-4a89-ac89-76374ce62e31-kube-api-access-zpwpz\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.465190 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-utilities\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.465233 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-catalog-content\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.487764 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c49gt"] Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.566303 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-utilities\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.566780 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-catalog-content\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.566837 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpwpz\" (UniqueName: \"kubernetes.io/projected/ec990c6a-30af-4a89-ac89-76374ce62e31-kube-api-access-zpwpz\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.566982 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-utilities\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.567270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-catalog-content\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.592145 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpwpz\" (UniqueName: \"kubernetes.io/projected/ec990c6a-30af-4a89-ac89-76374ce62e31-kube-api-access-zpwpz\") pod \"redhat-marketplace-c49gt\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:58 crc kubenswrapper[4831]: I1203 07:24:58.792089 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:24:59 crc kubenswrapper[4831]: I1203 07:24:59.014601 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:24:59 crc kubenswrapper[4831]: W1203 07:24:59.314656 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec990c6a_30af_4a89_ac89_76374ce62e31.slice/crio-d58f2af3c0b50007da092ee1cb5118105d2fa60ddadb19a4040841109e1237c5 WatchSource:0}: Error finding container d58f2af3c0b50007da092ee1cb5118105d2fa60ddadb19a4040841109e1237c5: Status 404 returned error can't find the container with id d58f2af3c0b50007da092ee1cb5118105d2fa60ddadb19a4040841109e1237c5 Dec 03 07:24:59 crc kubenswrapper[4831]: I1203 07:24:59.323245 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-c49gt"] Dec 03 07:24:59 crc kubenswrapper[4831]: I1203 07:24:59.705113 4831 generic.go:334] "Generic (PLEG): container finished" podID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerID="e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593" exitCode=0 Dec 03 07:24:59 crc kubenswrapper[4831]: I1203 07:24:59.705229 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c49gt" event={"ID":"ec990c6a-30af-4a89-ac89-76374ce62e31","Type":"ContainerDied","Data":"e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593"} Dec 03 07:24:59 crc kubenswrapper[4831]: I1203 07:24:59.705273 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c49gt" event={"ID":"ec990c6a-30af-4a89-ac89-76374ce62e31","Type":"ContainerStarted","Data":"d58f2af3c0b50007da092ee1cb5118105d2fa60ddadb19a4040841109e1237c5"} Dec 03 07:24:59 crc kubenswrapper[4831]: I1203 07:24:59.709338 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"30217bbfbbca1e96e56a9110617a3283cb42ac5aa1bdaaf69fb3702e97fe484a"} Dec 03 07:25:01 crc kubenswrapper[4831]: I1203 07:25:01.733915 4831 generic.go:334] "Generic (PLEG): container finished" podID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerID="5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0" exitCode=0 Dec 03 07:25:01 crc kubenswrapper[4831]: I1203 07:25:01.734035 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c49gt" event={"ID":"ec990c6a-30af-4a89-ac89-76374ce62e31","Type":"ContainerDied","Data":"5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0"} Dec 03 07:25:02 crc kubenswrapper[4831]: I1203 07:25:02.744924 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c49gt" event={"ID":"ec990c6a-30af-4a89-ac89-76374ce62e31","Type":"ContainerStarted","Data":"51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4"} Dec 03 07:25:02 crc kubenswrapper[4831]: I1203 07:25:02.769897 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-c49gt" podStartSLOduration=2.298091472 podStartE2EDuration="4.769873472s" podCreationTimestamp="2025-12-03 07:24:58 +0000 UTC" firstStartedPulling="2025-12-03 07:24:59.708513371 +0000 UTC m=+3237.052096909" lastFinishedPulling="2025-12-03 07:25:02.180295361 +0000 UTC m=+3239.523878909" observedRunningTime="2025-12-03 07:25:02.768404417 +0000 UTC m=+3240.111987955" watchObservedRunningTime="2025-12-03 07:25:02.769873472 +0000 UTC m=+3240.113457000" Dec 03 07:25:08 crc kubenswrapper[4831]: I1203 07:25:08.794633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:25:08 crc kubenswrapper[4831]: I1203 
07:25:08.795572 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:25:08 crc kubenswrapper[4831]: I1203 07:25:08.872608 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:25:09 crc kubenswrapper[4831]: I1203 07:25:09.890825 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:25:09 crc kubenswrapper[4831]: I1203 07:25:09.996225 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c49gt"] Dec 03 07:25:11 crc kubenswrapper[4831]: I1203 07:25:11.826454 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-c49gt" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerName="registry-server" containerID="cri-o://51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4" gracePeriod=2 Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.302849 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.442632 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-utilities\") pod \"ec990c6a-30af-4a89-ac89-76374ce62e31\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.442688 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-catalog-content\") pod \"ec990c6a-30af-4a89-ac89-76374ce62e31\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.442716 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpwpz\" (UniqueName: \"kubernetes.io/projected/ec990c6a-30af-4a89-ac89-76374ce62e31-kube-api-access-zpwpz\") pod \"ec990c6a-30af-4a89-ac89-76374ce62e31\" (UID: \"ec990c6a-30af-4a89-ac89-76374ce62e31\") " Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.443630 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-utilities" (OuterVolumeSpecName: "utilities") pod "ec990c6a-30af-4a89-ac89-76374ce62e31" (UID: "ec990c6a-30af-4a89-ac89-76374ce62e31"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.449045 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec990c6a-30af-4a89-ac89-76374ce62e31-kube-api-access-zpwpz" (OuterVolumeSpecName: "kube-api-access-zpwpz") pod "ec990c6a-30af-4a89-ac89-76374ce62e31" (UID: "ec990c6a-30af-4a89-ac89-76374ce62e31"). InnerVolumeSpecName "kube-api-access-zpwpz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.463482 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ec990c6a-30af-4a89-ac89-76374ce62e31" (UID: "ec990c6a-30af-4a89-ac89-76374ce62e31"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.544466 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpwpz\" (UniqueName: \"kubernetes.io/projected/ec990c6a-30af-4a89-ac89-76374ce62e31-kube-api-access-zpwpz\") on node \"crc\" DevicePath \"\"" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.544521 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.544541 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec990c6a-30af-4a89-ac89-76374ce62e31-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.836783 4831 generic.go:334] "Generic (PLEG): container finished" podID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerID="51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4" exitCode=0 Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.836866 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-c49gt" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.836868 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c49gt" event={"ID":"ec990c6a-30af-4a89-ac89-76374ce62e31","Type":"ContainerDied","Data":"51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4"} Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.838523 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-c49gt" event={"ID":"ec990c6a-30af-4a89-ac89-76374ce62e31","Type":"ContainerDied","Data":"d58f2af3c0b50007da092ee1cb5118105d2fa60ddadb19a4040841109e1237c5"} Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.838610 4831 scope.go:117] "RemoveContainer" containerID="51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.883355 4831 scope.go:117] "RemoveContainer" containerID="5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.885382 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-c49gt"] Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.898890 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-c49gt"] Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.920676 4831 scope.go:117] "RemoveContainer" containerID="e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.956909 4831 scope.go:117] "RemoveContainer" containerID="51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4" Dec 03 07:25:12 crc kubenswrapper[4831]: E1203 07:25:12.957444 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4\": container with ID starting with 51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4 not found: ID does not exist" containerID="51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.957479 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4"} err="failed to get container status \"51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4\": rpc error: code = NotFound desc = could not find container \"51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4\": container with ID starting with 51852b4edcdf8fbecb1a1a58df83eee96da5a843dc352f2a7dfaf07179b879f4 not found: ID does not exist" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.957507 4831 scope.go:117] "RemoveContainer" containerID="5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0" Dec 03 07:25:12 crc kubenswrapper[4831]: E1203 07:25:12.958145 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0\": container with ID starting with 5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0 not found: ID does not exist" containerID="5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.958175 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0"} err="failed to get container status \"5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0\": rpc error: code = NotFound desc = could not find container \"5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0\": container with ID 
starting with 5f62b8f9b84c90730fec9ed257823b346ba0e8cb54111173089b4237aa04caa0 not found: ID does not exist" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.958193 4831 scope.go:117] "RemoveContainer" containerID="e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593" Dec 03 07:25:12 crc kubenswrapper[4831]: E1203 07:25:12.958696 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593\": container with ID starting with e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593 not found: ID does not exist" containerID="e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593" Dec 03 07:25:12 crc kubenswrapper[4831]: I1203 07:25:12.958863 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593"} err="failed to get container status \"e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593\": rpc error: code = NotFound desc = could not find container \"e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593\": container with ID starting with e70528d0b3bd4b3262e07c196f8ce8a0a1bbe93bb60655f2e6522e4aed2cf593 not found: ID does not exist" Dec 03 07:25:13 crc kubenswrapper[4831]: I1203 07:25:13.030108 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" path="/var/lib/kubelet/pods/ec990c6a-30af-4a89-ac89-76374ce62e31/volumes" Dec 03 07:27:27 crc kubenswrapper[4831]: I1203 07:27:27.597800 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:27:27 crc kubenswrapper[4831]: I1203 
07:27:27.598584 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:27:57 crc kubenswrapper[4831]: I1203 07:27:57.597404 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:27:57 crc kubenswrapper[4831]: I1203 07:27:57.598052 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:28:27 crc kubenswrapper[4831]: I1203 07:28:27.596879 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:28:27 crc kubenswrapper[4831]: I1203 07:28:27.597447 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:28:27 crc kubenswrapper[4831]: I1203 07:28:27.597493 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 07:28:27 crc kubenswrapper[4831]: I1203 07:28:27.598082 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"30217bbfbbca1e96e56a9110617a3283cb42ac5aa1bdaaf69fb3702e97fe484a"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:28:27 crc kubenswrapper[4831]: I1203 07:28:27.598133 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://30217bbfbbca1e96e56a9110617a3283cb42ac5aa1bdaaf69fb3702e97fe484a" gracePeriod=600 Dec 03 07:28:27 crc kubenswrapper[4831]: I1203 07:28:27.747778 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="30217bbfbbca1e96e56a9110617a3283cb42ac5aa1bdaaf69fb3702e97fe484a" exitCode=0 Dec 03 07:28:27 crc kubenswrapper[4831]: I1203 07:28:27.747835 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"30217bbfbbca1e96e56a9110617a3283cb42ac5aa1bdaaf69fb3702e97fe484a"} Dec 03 07:28:27 crc kubenswrapper[4831]: I1203 07:28:27.747926 4831 scope.go:117] "RemoveContainer" containerID="8fae4abd9932f6005b51606f070f1ad7cfdb82176e26fc944c55e4c6e05045f9" Dec 03 07:28:28 crc kubenswrapper[4831]: I1203 07:28:28.762374 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721"} Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.077870 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tt4bd"] Dec 03 07:28:46 crc kubenswrapper[4831]: E1203 07:28:46.078740 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerName="extract-content" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.078757 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerName="extract-content" Dec 03 07:28:46 crc kubenswrapper[4831]: E1203 07:28:46.078771 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerName="registry-server" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.078779 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerName="registry-server" Dec 03 07:28:46 crc kubenswrapper[4831]: E1203 07:28:46.078790 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerName="extract-utilities" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.078798 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerName="extract-utilities" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.078973 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec990c6a-30af-4a89-ac89-76374ce62e31" containerName="registry-server" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.080117 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.114605 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tt4bd"] Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.129171 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-utilities\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.129251 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vs9dk\" (UniqueName: \"kubernetes.io/projected/45d48dfc-b570-418b-bffb-f6c2256dc806-kube-api-access-vs9dk\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.129330 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-catalog-content\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.229999 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-utilities\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.230098 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vs9dk\" (UniqueName: \"kubernetes.io/projected/45d48dfc-b570-418b-bffb-f6c2256dc806-kube-api-access-vs9dk\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.230182 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-catalog-content\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.230562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-utilities\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.230670 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-catalog-content\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.251682 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vs9dk\" (UniqueName: \"kubernetes.io/projected/45d48dfc-b570-418b-bffb-f6c2256dc806-kube-api-access-vs9dk\") pod \"community-operators-tt4bd\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.426156 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:46 crc kubenswrapper[4831]: I1203 07:28:46.951303 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tt4bd"] Dec 03 07:28:46 crc kubenswrapper[4831]: W1203 07:28:46.952578 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45d48dfc_b570_418b_bffb_f6c2256dc806.slice/crio-7fb5f8f314482ce88723e8ff31fa03708137d7e3425dd595b462437881c59d26 WatchSource:0}: Error finding container 7fb5f8f314482ce88723e8ff31fa03708137d7e3425dd595b462437881c59d26: Status 404 returned error can't find the container with id 7fb5f8f314482ce88723e8ff31fa03708137d7e3425dd595b462437881c59d26 Dec 03 07:28:47 crc kubenswrapper[4831]: I1203 07:28:47.933123 4831 generic.go:334] "Generic (PLEG): container finished" podID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerID="c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414" exitCode=0 Dec 03 07:28:47 crc kubenswrapper[4831]: I1203 07:28:47.933227 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tt4bd" event={"ID":"45d48dfc-b570-418b-bffb-f6c2256dc806","Type":"ContainerDied","Data":"c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414"} Dec 03 07:28:47 crc kubenswrapper[4831]: I1203 07:28:47.933584 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tt4bd" event={"ID":"45d48dfc-b570-418b-bffb-f6c2256dc806","Type":"ContainerStarted","Data":"7fb5f8f314482ce88723e8ff31fa03708137d7e3425dd595b462437881c59d26"} Dec 03 07:28:47 crc kubenswrapper[4831]: I1203 07:28:47.935904 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:28:48 crc kubenswrapper[4831]: I1203 07:28:48.946485 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-tt4bd" event={"ID":"45d48dfc-b570-418b-bffb-f6c2256dc806","Type":"ContainerStarted","Data":"de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79"} Dec 03 07:28:49 crc kubenswrapper[4831]: I1203 07:28:49.960555 4831 generic.go:334] "Generic (PLEG): container finished" podID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerID="de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79" exitCode=0 Dec 03 07:28:49 crc kubenswrapper[4831]: I1203 07:28:49.960732 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tt4bd" event={"ID":"45d48dfc-b570-418b-bffb-f6c2256dc806","Type":"ContainerDied","Data":"de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79"} Dec 03 07:28:50 crc kubenswrapper[4831]: I1203 07:28:50.972454 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tt4bd" event={"ID":"45d48dfc-b570-418b-bffb-f6c2256dc806","Type":"ContainerStarted","Data":"ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1"} Dec 03 07:28:51 crc kubenswrapper[4831]: I1203 07:28:51.002969 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tt4bd" podStartSLOduration=2.506075983 podStartE2EDuration="5.002951496s" podCreationTimestamp="2025-12-03 07:28:46 +0000 UTC" firstStartedPulling="2025-12-03 07:28:47.935480891 +0000 UTC m=+3465.279064439" lastFinishedPulling="2025-12-03 07:28:50.432356434 +0000 UTC m=+3467.775939952" observedRunningTime="2025-12-03 07:28:50.997507006 +0000 UTC m=+3468.341090544" watchObservedRunningTime="2025-12-03 07:28:51.002951496 +0000 UTC m=+3468.346535014" Dec 03 07:28:56 crc kubenswrapper[4831]: I1203 07:28:56.426816 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:56 crc kubenswrapper[4831]: I1203 07:28:56.427192 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:56 crc kubenswrapper[4831]: I1203 07:28:56.483393 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:57 crc kubenswrapper[4831]: I1203 07:28:57.093258 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:57 crc kubenswrapper[4831]: I1203 07:28:57.168004 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tt4bd"] Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.044587 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tt4bd" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerName="registry-server" containerID="cri-o://ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1" gracePeriod=2 Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.491296 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.643197 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-catalog-content\") pod \"45d48dfc-b570-418b-bffb-f6c2256dc806\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.643770 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vs9dk\" (UniqueName: \"kubernetes.io/projected/45d48dfc-b570-418b-bffb-f6c2256dc806-kube-api-access-vs9dk\") pod \"45d48dfc-b570-418b-bffb-f6c2256dc806\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.645498 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-utilities\") pod \"45d48dfc-b570-418b-bffb-f6c2256dc806\" (UID: \"45d48dfc-b570-418b-bffb-f6c2256dc806\") " Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.646276 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-utilities" (OuterVolumeSpecName: "utilities") pod "45d48dfc-b570-418b-bffb-f6c2256dc806" (UID: "45d48dfc-b570-418b-bffb-f6c2256dc806"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.654665 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/45d48dfc-b570-418b-bffb-f6c2256dc806-kube-api-access-vs9dk" (OuterVolumeSpecName: "kube-api-access-vs9dk") pod "45d48dfc-b570-418b-bffb-f6c2256dc806" (UID: "45d48dfc-b570-418b-bffb-f6c2256dc806"). InnerVolumeSpecName "kube-api-access-vs9dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.723692 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "45d48dfc-b570-418b-bffb-f6c2256dc806" (UID: "45d48dfc-b570-418b-bffb-f6c2256dc806"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.747433 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vs9dk\" (UniqueName: \"kubernetes.io/projected/45d48dfc-b570-418b-bffb-f6c2256dc806-kube-api-access-vs9dk\") on node \"crc\" DevicePath \"\"" Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.747679 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:28:59 crc kubenswrapper[4831]: I1203 07:28:59.747692 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/45d48dfc-b570-418b-bffb-f6c2256dc806-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.055641 4831 generic.go:334] "Generic (PLEG): container finished" podID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerID="ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1" exitCode=0 Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.055694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tt4bd" event={"ID":"45d48dfc-b570-418b-bffb-f6c2256dc806","Type":"ContainerDied","Data":"ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1"} Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.055727 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-tt4bd" event={"ID":"45d48dfc-b570-418b-bffb-f6c2256dc806","Type":"ContainerDied","Data":"7fb5f8f314482ce88723e8ff31fa03708137d7e3425dd595b462437881c59d26"} Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.055752 4831 scope.go:117] "RemoveContainer" containerID="ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.055879 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tt4bd" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.082743 4831 scope.go:117] "RemoveContainer" containerID="de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.110106 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tt4bd"] Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.118783 4831 scope.go:117] "RemoveContainer" containerID="c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.119592 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tt4bd"] Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.141637 4831 scope.go:117] "RemoveContainer" containerID="ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1" Dec 03 07:29:00 crc kubenswrapper[4831]: E1203 07:29:00.141955 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1\": container with ID starting with ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1 not found: ID does not exist" containerID="ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 
07:29:00.141988 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1"} err="failed to get container status \"ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1\": rpc error: code = NotFound desc = could not find container \"ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1\": container with ID starting with ae310a98038cc34e4e82a7aa6f5f869b0763533f9d9c5c11f08dcb40c698abe1 not found: ID does not exist" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.142010 4831 scope.go:117] "RemoveContainer" containerID="de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79" Dec 03 07:29:00 crc kubenswrapper[4831]: E1203 07:29:00.142253 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79\": container with ID starting with de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79 not found: ID does not exist" containerID="de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.142270 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79"} err="failed to get container status \"de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79\": rpc error: code = NotFound desc = could not find container \"de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79\": container with ID starting with de9ea7d53ba9b6195c6da5c0ae230decc7fd3396720f6bf3041734a8183d2b79 not found: ID does not exist" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.142283 4831 scope.go:117] "RemoveContainer" containerID="c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414" Dec 03 07:29:00 crc 
kubenswrapper[4831]: E1203 07:29:00.142720 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414\": container with ID starting with c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414 not found: ID does not exist" containerID="c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414" Dec 03 07:29:00 crc kubenswrapper[4831]: I1203 07:29:00.142742 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414"} err="failed to get container status \"c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414\": rpc error: code = NotFound desc = could not find container \"c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414\": container with ID starting with c9b0456b763d439c418f513fa501651178b35673a5116d2498d46fa867d92414 not found: ID does not exist" Dec 03 07:29:01 crc kubenswrapper[4831]: I1203 07:29:01.029279 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" path="/var/lib/kubelet/pods/45d48dfc-b570-418b-bffb-f6c2256dc806/volumes" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.165971 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz"] Dec 03 07:30:00 crc kubenswrapper[4831]: E1203 07:30:00.167078 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerName="extract-utilities" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.167096 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerName="extract-utilities" Dec 03 07:30:00 crc kubenswrapper[4831]: E1203 07:30:00.167115 4831 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerName="extract-content" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.167122 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerName="extract-content" Dec 03 07:30:00 crc kubenswrapper[4831]: E1203 07:30:00.167137 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerName="registry-server" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.167142 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerName="registry-server" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.167288 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="45d48dfc-b570-418b-bffb-f6c2256dc806" containerName="registry-server" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.167789 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.170720 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.171052 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.186003 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz"] Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.205455 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8f3db3-96a2-4695-8c68-446fc5d299da-config-volume\") pod 
\"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.205739 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8f3db3-96a2-4695-8c68-446fc5d299da-secret-volume\") pod \"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.205833 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs72m\" (UniqueName: \"kubernetes.io/projected/ed8f3db3-96a2-4695-8c68-446fc5d299da-kube-api-access-bs72m\") pod \"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.306782 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8f3db3-96a2-4695-8c68-446fc5d299da-secret-volume\") pod \"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.306839 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs72m\" (UniqueName: \"kubernetes.io/projected/ed8f3db3-96a2-4695-8c68-446fc5d299da-kube-api-access-bs72m\") pod \"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.306913 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8f3db3-96a2-4695-8c68-446fc5d299da-config-volume\") pod \"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.308938 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8f3db3-96a2-4695-8c68-446fc5d299da-config-volume\") pod \"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.314926 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8f3db3-96a2-4695-8c68-446fc5d299da-secret-volume\") pod \"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.334166 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs72m\" (UniqueName: \"kubernetes.io/projected/ed8f3db3-96a2-4695-8c68-446fc5d299da-kube-api-access-bs72m\") pod \"collect-profiles-29412450-q7hsz\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.494401 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:00 crc kubenswrapper[4831]: I1203 07:30:00.788265 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz"] Dec 03 07:30:00 crc kubenswrapper[4831]: W1203 07:30:00.800205 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded8f3db3_96a2_4695_8c68_446fc5d299da.slice/crio-32d042cf9d51fbe99daa43227fff171f6cda896fd8f3124d62d8f1cbc5a2bba2 WatchSource:0}: Error finding container 32d042cf9d51fbe99daa43227fff171f6cda896fd8f3124d62d8f1cbc5a2bba2: Status 404 returned error can't find the container with id 32d042cf9d51fbe99daa43227fff171f6cda896fd8f3124d62d8f1cbc5a2bba2 Dec 03 07:30:01 crc kubenswrapper[4831]: I1203 07:30:01.665590 4831 generic.go:334] "Generic (PLEG): container finished" podID="ed8f3db3-96a2-4695-8c68-446fc5d299da" containerID="fd71482d9f20efd9e15333458c0dafd8184ac4a2e3cd91a3595f4d7d91bad990" exitCode=0 Dec 03 07:30:01 crc kubenswrapper[4831]: I1203 07:30:01.665657 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" event={"ID":"ed8f3db3-96a2-4695-8c68-446fc5d299da","Type":"ContainerDied","Data":"fd71482d9f20efd9e15333458c0dafd8184ac4a2e3cd91a3595f4d7d91bad990"} Dec 03 07:30:01 crc kubenswrapper[4831]: I1203 07:30:01.665707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" event={"ID":"ed8f3db3-96a2-4695-8c68-446fc5d299da","Type":"ContainerStarted","Data":"32d042cf9d51fbe99daa43227fff171f6cda896fd8f3124d62d8f1cbc5a2bba2"} Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:02.998255 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.173637 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8f3db3-96a2-4695-8c68-446fc5d299da-config-volume\") pod \"ed8f3db3-96a2-4695-8c68-446fc5d299da\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.173778 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8f3db3-96a2-4695-8c68-446fc5d299da-secret-volume\") pod \"ed8f3db3-96a2-4695-8c68-446fc5d299da\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.173847 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bs72m\" (UniqueName: \"kubernetes.io/projected/ed8f3db3-96a2-4695-8c68-446fc5d299da-kube-api-access-bs72m\") pod \"ed8f3db3-96a2-4695-8c68-446fc5d299da\" (UID: \"ed8f3db3-96a2-4695-8c68-446fc5d299da\") " Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.174641 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ed8f3db3-96a2-4695-8c68-446fc5d299da-config-volume" (OuterVolumeSpecName: "config-volume") pod "ed8f3db3-96a2-4695-8c68-446fc5d299da" (UID: "ed8f3db3-96a2-4695-8c68-446fc5d299da"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.180279 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed8f3db3-96a2-4695-8c68-446fc5d299da-kube-api-access-bs72m" (OuterVolumeSpecName: "kube-api-access-bs72m") pod "ed8f3db3-96a2-4695-8c68-446fc5d299da" (UID: "ed8f3db3-96a2-4695-8c68-446fc5d299da"). 
InnerVolumeSpecName "kube-api-access-bs72m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.186498 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed8f3db3-96a2-4695-8c68-446fc5d299da-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "ed8f3db3-96a2-4695-8c68-446fc5d299da" (UID: "ed8f3db3-96a2-4695-8c68-446fc5d299da"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.275220 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/ed8f3db3-96a2-4695-8c68-446fc5d299da-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.275253 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bs72m\" (UniqueName: \"kubernetes.io/projected/ed8f3db3-96a2-4695-8c68-446fc5d299da-kube-api-access-bs72m\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.275265 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed8f3db3-96a2-4695-8c68-446fc5d299da-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.687292 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" event={"ID":"ed8f3db3-96a2-4695-8c68-446fc5d299da","Type":"ContainerDied","Data":"32d042cf9d51fbe99daa43227fff171f6cda896fd8f3124d62d8f1cbc5a2bba2"} Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.687409 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz" Dec 03 07:30:03 crc kubenswrapper[4831]: I1203 07:30:03.687421 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32d042cf9d51fbe99daa43227fff171f6cda896fd8f3124d62d8f1cbc5a2bba2" Dec 03 07:30:04 crc kubenswrapper[4831]: I1203 07:30:04.064925 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c"] Dec 03 07:30:04 crc kubenswrapper[4831]: I1203 07:30:04.069884 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412405-bq88c"] Dec 03 07:30:05 crc kubenswrapper[4831]: I1203 07:30:05.028937 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81173897-29c5-4ce0-a308-f48eadb82cc4" path="/var/lib/kubelet/pods/81173897-29c5-4ce0-a308-f48eadb82cc4/volumes" Dec 03 07:30:17 crc kubenswrapper[4831]: I1203 07:30:17.638568 4831 scope.go:117] "RemoveContainer" containerID="a05de0e8b4131a01238775493bdc11ac50bac649e8c4b73a406888571d862008" Dec 03 07:30:27 crc kubenswrapper[4831]: I1203 07:30:27.596892 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:30:27 crc kubenswrapper[4831]: I1203 07:30:27.597373 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:30:57 crc kubenswrapper[4831]: I1203 07:30:57.596975 4831 patch_prober.go:28] interesting 
pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:30:57 crc kubenswrapper[4831]: I1203 07:30:57.597694 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:31:27 crc kubenswrapper[4831]: I1203 07:31:27.596591 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:31:27 crc kubenswrapper[4831]: I1203 07:31:27.597159 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:31:27 crc kubenswrapper[4831]: I1203 07:31:27.597210 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 07:31:27 crc kubenswrapper[4831]: I1203 07:31:27.597885 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed 
liveness probe, will be restarted" Dec 03 07:31:27 crc kubenswrapper[4831]: I1203 07:31:27.597943 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" gracePeriod=600 Dec 03 07:31:27 crc kubenswrapper[4831]: E1203 07:31:27.729719 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:31:28 crc kubenswrapper[4831]: I1203 07:31:28.492563 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" exitCode=0 Dec 03 07:31:28 crc kubenswrapper[4831]: I1203 07:31:28.492631 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721"} Dec 03 07:31:28 crc kubenswrapper[4831]: I1203 07:31:28.492679 4831 scope.go:117] "RemoveContainer" containerID="30217bbfbbca1e96e56a9110617a3283cb42ac5aa1bdaaf69fb3702e97fe484a" Dec 03 07:31:28 crc kubenswrapper[4831]: I1203 07:31:28.498747 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:31:28 crc kubenswrapper[4831]: E1203 07:31:28.499399 4831 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:31:40 crc kubenswrapper[4831]: I1203 07:31:40.013130 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:31:40 crc kubenswrapper[4831]: E1203 07:31:40.014234 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:31:51 crc kubenswrapper[4831]: I1203 07:31:51.013701 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:31:51 crc kubenswrapper[4831]: E1203 07:31:51.016065 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:32:02 crc kubenswrapper[4831]: I1203 07:32:02.013186 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:32:02 crc kubenswrapper[4831]: E1203 07:32:02.014468 4831 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:32:13 crc kubenswrapper[4831]: I1203 07:32:13.021130 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:32:13 crc kubenswrapper[4831]: E1203 07:32:13.022217 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:32:27 crc kubenswrapper[4831]: I1203 07:32:27.013206 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:32:27 crc kubenswrapper[4831]: E1203 07:32:27.014277 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:32:42 crc kubenswrapper[4831]: I1203 07:32:42.013029 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:32:42 crc kubenswrapper[4831]: E1203 07:32:42.014241 4831 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:32:53 crc kubenswrapper[4831]: I1203 07:32:53.021342 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:32:53 crc kubenswrapper[4831]: E1203 07:32:53.023590 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:33:05 crc kubenswrapper[4831]: I1203 07:33:05.013927 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:33:05 crc kubenswrapper[4831]: E1203 07:33:05.015017 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:33:16 crc kubenswrapper[4831]: I1203 07:33:16.013841 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:33:16 crc kubenswrapper[4831]: E1203 
07:33:16.014977 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:33:27 crc kubenswrapper[4831]: I1203 07:33:27.013006 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:33:27 crc kubenswrapper[4831]: E1203 07:33:27.013915 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:33:38 crc kubenswrapper[4831]: I1203 07:33:38.012107 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:33:38 crc kubenswrapper[4831]: E1203 07:33:38.012911 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:33:40 crc kubenswrapper[4831]: I1203 07:33:40.717003 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-l9n6h"] Dec 03 07:33:40 crc 
kubenswrapper[4831]: E1203 07:33:40.717914 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed8f3db3-96a2-4695-8c68-446fc5d299da" containerName="collect-profiles" Dec 03 07:33:40 crc kubenswrapper[4831]: I1203 07:33:40.717942 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed8f3db3-96a2-4695-8c68-446fc5d299da" containerName="collect-profiles" Dec 03 07:33:40 crc kubenswrapper[4831]: I1203 07:33:40.718365 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed8f3db3-96a2-4695-8c68-446fc5d299da" containerName="collect-profiles" Dec 03 07:33:40 crc kubenswrapper[4831]: I1203 07:33:40.720018 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:40 crc kubenswrapper[4831]: I1203 07:33:40.734904 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l9n6h"] Dec 03 07:33:40 crc kubenswrapper[4831]: I1203 07:33:40.912774 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqp6x\" (UniqueName: \"kubernetes.io/projected/4de80cca-4733-4917-9753-bc84b0667513-kube-api-access-cqp6x\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:40 crc kubenswrapper[4831]: I1203 07:33:40.912848 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-utilities\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:40 crc kubenswrapper[4831]: I1203 07:33:40.912932 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-catalog-content\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.014461 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqp6x\" (UniqueName: \"kubernetes.io/projected/4de80cca-4733-4917-9753-bc84b0667513-kube-api-access-cqp6x\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.014900 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-utilities\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.015123 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-catalog-content\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.015481 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-utilities\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.015562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-catalog-content\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.044722 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqp6x\" (UniqueName: \"kubernetes.io/projected/4de80cca-4733-4917-9753-bc84b0667513-kube-api-access-cqp6x\") pod \"certified-operators-l9n6h\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.052242 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.512663 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-l9n6h"] Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.816213 4831 generic.go:334] "Generic (PLEG): container finished" podID="4de80cca-4733-4917-9753-bc84b0667513" containerID="f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba" exitCode=0 Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.816589 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9n6h" event={"ID":"4de80cca-4733-4917-9753-bc84b0667513","Type":"ContainerDied","Data":"f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba"} Dec 03 07:33:41 crc kubenswrapper[4831]: I1203 07:33:41.817380 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9n6h" event={"ID":"4de80cca-4733-4917-9753-bc84b0667513","Type":"ContainerStarted","Data":"fae8920375aa3d40485029a8ddcdf09e37aae123a74c2f9dc5b4eaa2aac3084b"} Dec 03 07:33:43 crc kubenswrapper[4831]: I1203 07:33:43.835168 4831 generic.go:334] "Generic (PLEG): container 
finished" podID="4de80cca-4733-4917-9753-bc84b0667513" containerID="e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d" exitCode=0 Dec 03 07:33:43 crc kubenswrapper[4831]: I1203 07:33:43.835237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9n6h" event={"ID":"4de80cca-4733-4917-9753-bc84b0667513","Type":"ContainerDied","Data":"e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d"} Dec 03 07:33:44 crc kubenswrapper[4831]: I1203 07:33:44.846740 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9n6h" event={"ID":"4de80cca-4733-4917-9753-bc84b0667513","Type":"ContainerStarted","Data":"957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f"} Dec 03 07:33:44 crc kubenswrapper[4831]: I1203 07:33:44.876697 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-l9n6h" podStartSLOduration=2.238824772 podStartE2EDuration="4.876676755s" podCreationTimestamp="2025-12-03 07:33:40 +0000 UTC" firstStartedPulling="2025-12-03 07:33:41.820050429 +0000 UTC m=+3759.163633937" lastFinishedPulling="2025-12-03 07:33:44.457902392 +0000 UTC m=+3761.801485920" observedRunningTime="2025-12-03 07:33:44.874585389 +0000 UTC m=+3762.218168927" watchObservedRunningTime="2025-12-03 07:33:44.876676755 +0000 UTC m=+3762.220260263" Dec 03 07:33:49 crc kubenswrapper[4831]: I1203 07:33:49.012912 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:33:49 crc kubenswrapper[4831]: E1203 07:33:49.013455 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:33:51 crc kubenswrapper[4831]: I1203 07:33:51.053598 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:51 crc kubenswrapper[4831]: I1203 07:33:51.054180 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:51 crc kubenswrapper[4831]: I1203 07:33:51.130140 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:51 crc kubenswrapper[4831]: I1203 07:33:51.980458 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:52 crc kubenswrapper[4831]: I1203 07:33:52.055553 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l9n6h"] Dec 03 07:33:53 crc kubenswrapper[4831]: I1203 07:33:53.918903 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-l9n6h" podUID="4de80cca-4733-4917-9753-bc84b0667513" containerName="registry-server" containerID="cri-o://957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f" gracePeriod=2 Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.801376 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.925889 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-utilities\") pod \"4de80cca-4733-4917-9753-bc84b0667513\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.926056 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqp6x\" (UniqueName: \"kubernetes.io/projected/4de80cca-4733-4917-9753-bc84b0667513-kube-api-access-cqp6x\") pod \"4de80cca-4733-4917-9753-bc84b0667513\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.926076 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-catalog-content\") pod \"4de80cca-4733-4917-9753-bc84b0667513\" (UID: \"4de80cca-4733-4917-9753-bc84b0667513\") " Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.927297 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-utilities" (OuterVolumeSpecName: "utilities") pod "4de80cca-4733-4917-9753-bc84b0667513" (UID: "4de80cca-4733-4917-9753-bc84b0667513"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.930134 4831 generic.go:334] "Generic (PLEG): container finished" podID="4de80cca-4733-4917-9753-bc84b0667513" containerID="957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f" exitCode=0 Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.930185 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9n6h" event={"ID":"4de80cca-4733-4917-9753-bc84b0667513","Type":"ContainerDied","Data":"957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f"} Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.930227 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-l9n6h" event={"ID":"4de80cca-4733-4917-9753-bc84b0667513","Type":"ContainerDied","Data":"fae8920375aa3d40485029a8ddcdf09e37aae123a74c2f9dc5b4eaa2aac3084b"} Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.930255 4831 scope.go:117] "RemoveContainer" containerID="957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f" Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.930446 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-l9n6h" Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.942700 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4de80cca-4733-4917-9753-bc84b0667513-kube-api-access-cqp6x" (OuterVolumeSpecName: "kube-api-access-cqp6x") pod "4de80cca-4733-4917-9753-bc84b0667513" (UID: "4de80cca-4733-4917-9753-bc84b0667513"). InnerVolumeSpecName "kube-api-access-cqp6x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.976912 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4de80cca-4733-4917-9753-bc84b0667513" (UID: "4de80cca-4733-4917-9753-bc84b0667513"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:33:54 crc kubenswrapper[4831]: I1203 07:33:54.977397 4831 scope.go:117] "RemoveContainer" containerID="e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.005464 4831 scope.go:117] "RemoveContainer" containerID="f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.027227 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.027267 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqp6x\" (UniqueName: \"kubernetes.io/projected/4de80cca-4733-4917-9753-bc84b0667513-kube-api-access-cqp6x\") on node \"crc\" DevicePath \"\"" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.027304 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4de80cca-4733-4917-9753-bc84b0667513-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.030298 4831 scope.go:117] "RemoveContainer" containerID="957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f" Dec 03 07:33:55 crc kubenswrapper[4831]: E1203 07:33:55.030762 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f\": container with ID starting with 957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f not found: ID does not exist" containerID="957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.030810 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f"} err="failed to get container status \"957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f\": rpc error: code = NotFound desc = could not find container \"957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f\": container with ID starting with 957346e56fa9ded956458888e74e8ecb525369147a3b12a2177e7e3dc98cb44f not found: ID does not exist" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.030843 4831 scope.go:117] "RemoveContainer" containerID="e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d" Dec 03 07:33:55 crc kubenswrapper[4831]: E1203 07:33:55.031284 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d\": container with ID starting with e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d not found: ID does not exist" containerID="e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.031331 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d"} err="failed to get container status \"e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d\": rpc error: code = NotFound desc = could not find container 
\"e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d\": container with ID starting with e380f263d8af1354e5c0521cde71cc067ab237c26496b2223c9fd86c73dd055d not found: ID does not exist" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.031352 4831 scope.go:117] "RemoveContainer" containerID="f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba" Dec 03 07:33:55 crc kubenswrapper[4831]: E1203 07:33:55.031750 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba\": container with ID starting with f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba not found: ID does not exist" containerID="f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.031815 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba"} err="failed to get container status \"f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba\": rpc error: code = NotFound desc = could not find container \"f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba\": container with ID starting with f1eb85ed084915e1a935122011b728e9f412ef3e7b69cbad17ee825825b3d1ba not found: ID does not exist" Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.255785 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-l9n6h"] Dec 03 07:33:55 crc kubenswrapper[4831]: I1203 07:33:55.260217 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-l9n6h"] Dec 03 07:33:57 crc kubenswrapper[4831]: I1203 07:33:57.021254 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4de80cca-4733-4917-9753-bc84b0667513" 
path="/var/lib/kubelet/pods/4de80cca-4733-4917-9753-bc84b0667513/volumes" Dec 03 07:34:02 crc kubenswrapper[4831]: I1203 07:34:02.013544 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:34:02 crc kubenswrapper[4831]: E1203 07:34:02.014381 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:34:14 crc kubenswrapper[4831]: I1203 07:34:14.013938 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:34:14 crc kubenswrapper[4831]: E1203 07:34:14.014886 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:34:25 crc kubenswrapper[4831]: I1203 07:34:25.013190 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:34:25 crc kubenswrapper[4831]: E1203 07:34:25.014097 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:34:40 crc kubenswrapper[4831]: I1203 07:34:40.014069 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:34:40 crc kubenswrapper[4831]: E1203 07:34:40.015377 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:34:51 crc kubenswrapper[4831]: I1203 07:34:51.013822 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:34:51 crc kubenswrapper[4831]: E1203 07:34:51.014554 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:35:06 crc kubenswrapper[4831]: I1203 07:35:06.013796 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:35:06 crc kubenswrapper[4831]: E1203 07:35:06.014907 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:35:18 crc kubenswrapper[4831]: I1203 07:35:18.013843 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:35:18 crc kubenswrapper[4831]: E1203 07:35:18.015207 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:35:33 crc kubenswrapper[4831]: I1203 07:35:33.021096 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:35:33 crc kubenswrapper[4831]: E1203 07:35:33.022214 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:35:44 crc kubenswrapper[4831]: I1203 07:35:44.012898 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:35:44 crc kubenswrapper[4831]: E1203 07:35:44.014030 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:35:56 crc kubenswrapper[4831]: I1203 07:35:56.012698 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:35:56 crc kubenswrapper[4831]: E1203 07:35:56.013889 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.718897 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-92lp7"] Dec 03 07:35:59 crc kubenswrapper[4831]: E1203 07:35:59.719251 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de80cca-4733-4917-9753-bc84b0667513" containerName="registry-server" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.719262 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de80cca-4733-4917-9753-bc84b0667513" containerName="registry-server" Dec 03 07:35:59 crc kubenswrapper[4831]: E1203 07:35:59.719281 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4de80cca-4733-4917-9753-bc84b0667513" containerName="extract-utilities" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.719287 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de80cca-4733-4917-9753-bc84b0667513" containerName="extract-utilities" Dec 03 07:35:59 crc kubenswrapper[4831]: E1203 07:35:59.719303 4831 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4de80cca-4733-4917-9753-bc84b0667513" containerName="extract-content" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.719309 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4de80cca-4733-4917-9753-bc84b0667513" containerName="extract-content" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.719462 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4de80cca-4733-4917-9753-bc84b0667513" containerName="registry-server" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.720453 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.743967 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-92lp7"] Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.755888 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnqpv\" (UniqueName: \"kubernetes.io/projected/865814ac-a91d-416f-b37e-a01f68b93277-kube-api-access-tnqpv\") pod \"redhat-marketplace-92lp7\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.755943 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-utilities\") pod \"redhat-marketplace-92lp7\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.755962 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-catalog-content\") pod \"redhat-marketplace-92lp7\" 
(UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.856752 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnqpv\" (UniqueName: \"kubernetes.io/projected/865814ac-a91d-416f-b37e-a01f68b93277-kube-api-access-tnqpv\") pod \"redhat-marketplace-92lp7\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.856812 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-utilities\") pod \"redhat-marketplace-92lp7\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.856847 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-catalog-content\") pod \"redhat-marketplace-92lp7\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.857416 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-catalog-content\") pod \"redhat-marketplace-92lp7\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.857632 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-utilities\") pod \"redhat-marketplace-92lp7\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " 
pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:35:59 crc kubenswrapper[4831]: I1203 07:35:59.879362 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnqpv\" (UniqueName: \"kubernetes.io/projected/865814ac-a91d-416f-b37e-a01f68b93277-kube-api-access-tnqpv\") pod \"redhat-marketplace-92lp7\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:36:00 crc kubenswrapper[4831]: I1203 07:36:00.037335 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:36:00 crc kubenswrapper[4831]: I1203 07:36:00.459433 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-92lp7"] Dec 03 07:36:01 crc kubenswrapper[4831]: I1203 07:36:01.051239 4831 generic.go:334] "Generic (PLEG): container finished" podID="865814ac-a91d-416f-b37e-a01f68b93277" containerID="45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588" exitCode=0 Dec 03 07:36:01 crc kubenswrapper[4831]: I1203 07:36:01.051356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92lp7" event={"ID":"865814ac-a91d-416f-b37e-a01f68b93277","Type":"ContainerDied","Data":"45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588"} Dec 03 07:36:01 crc kubenswrapper[4831]: I1203 07:36:01.051631 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92lp7" event={"ID":"865814ac-a91d-416f-b37e-a01f68b93277","Type":"ContainerStarted","Data":"7605575e3c9a4789436c92748b8b243aef025855b8066c131d84eb1f299f0231"} Dec 03 07:36:01 crc kubenswrapper[4831]: I1203 07:36:01.053290 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:36:03 crc kubenswrapper[4831]: I1203 07:36:03.070421 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="865814ac-a91d-416f-b37e-a01f68b93277" containerID="708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8" exitCode=0 Dec 03 07:36:03 crc kubenswrapper[4831]: I1203 07:36:03.070512 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92lp7" event={"ID":"865814ac-a91d-416f-b37e-a01f68b93277","Type":"ContainerDied","Data":"708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8"} Dec 03 07:36:04 crc kubenswrapper[4831]: I1203 07:36:04.084702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92lp7" event={"ID":"865814ac-a91d-416f-b37e-a01f68b93277","Type":"ContainerStarted","Data":"26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f"} Dec 03 07:36:04 crc kubenswrapper[4831]: I1203 07:36:04.113130 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-92lp7" podStartSLOduration=2.628523384 podStartE2EDuration="5.113108384s" podCreationTimestamp="2025-12-03 07:35:59 +0000 UTC" firstStartedPulling="2025-12-03 07:36:01.053071915 +0000 UTC m=+3898.396655423" lastFinishedPulling="2025-12-03 07:36:03.537656915 +0000 UTC m=+3900.881240423" observedRunningTime="2025-12-03 07:36:04.107826999 +0000 UTC m=+3901.451410517" watchObservedRunningTime="2025-12-03 07:36:04.113108384 +0000 UTC m=+3901.456691882" Dec 03 07:36:09 crc kubenswrapper[4831]: I1203 07:36:09.013370 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:36:09 crc kubenswrapper[4831]: E1203 07:36:09.013861 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:36:10 crc kubenswrapper[4831]: I1203 07:36:10.037946 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:36:10 crc kubenswrapper[4831]: I1203 07:36:10.038203 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:36:10 crc kubenswrapper[4831]: I1203 07:36:10.099609 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:36:10 crc kubenswrapper[4831]: I1203 07:36:10.201747 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:36:10 crc kubenswrapper[4831]: I1203 07:36:10.344356 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-92lp7"] Dec 03 07:36:12 crc kubenswrapper[4831]: I1203 07:36:12.154740 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-92lp7" podUID="865814ac-a91d-416f-b37e-a01f68b93277" containerName="registry-server" containerID="cri-o://26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f" gracePeriod=2 Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.115221 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.165567 4831 generic.go:334] "Generic (PLEG): container finished" podID="865814ac-a91d-416f-b37e-a01f68b93277" containerID="26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f" exitCode=0 Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.165602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92lp7" event={"ID":"865814ac-a91d-416f-b37e-a01f68b93277","Type":"ContainerDied","Data":"26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f"} Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.165616 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-92lp7" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.165625 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-92lp7" event={"ID":"865814ac-a91d-416f-b37e-a01f68b93277","Type":"ContainerDied","Data":"7605575e3c9a4789436c92748b8b243aef025855b8066c131d84eb1f299f0231"} Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.165641 4831 scope.go:117] "RemoveContainer" containerID="26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.169891 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-utilities\") pod \"865814ac-a91d-416f-b37e-a01f68b93277\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.169970 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnqpv\" (UniqueName: \"kubernetes.io/projected/865814ac-a91d-416f-b37e-a01f68b93277-kube-api-access-tnqpv\") pod 
\"865814ac-a91d-416f-b37e-a01f68b93277\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.169998 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-catalog-content\") pod \"865814ac-a91d-416f-b37e-a01f68b93277\" (UID: \"865814ac-a91d-416f-b37e-a01f68b93277\") " Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.172128 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-utilities" (OuterVolumeSpecName: "utilities") pod "865814ac-a91d-416f-b37e-a01f68b93277" (UID: "865814ac-a91d-416f-b37e-a01f68b93277"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.175295 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/865814ac-a91d-416f-b37e-a01f68b93277-kube-api-access-tnqpv" (OuterVolumeSpecName: "kube-api-access-tnqpv") pod "865814ac-a91d-416f-b37e-a01f68b93277" (UID: "865814ac-a91d-416f-b37e-a01f68b93277"). InnerVolumeSpecName "kube-api-access-tnqpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.187662 4831 scope.go:117] "RemoveContainer" containerID="708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.196212 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "865814ac-a91d-416f-b37e-a01f68b93277" (UID: "865814ac-a91d-416f-b37e-a01f68b93277"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.211416 4831 scope.go:117] "RemoveContainer" containerID="45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.233843 4831 scope.go:117] "RemoveContainer" containerID="26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f" Dec 03 07:36:13 crc kubenswrapper[4831]: E1203 07:36:13.234272 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f\": container with ID starting with 26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f not found: ID does not exist" containerID="26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.234350 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f"} err="failed to get container status \"26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f\": rpc error: code = NotFound desc = could not find container \"26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f\": container with ID starting with 26063d23d53151c993f622878dcc0ae0ac4872f23eb433c4b812746438b4ca7f not found: ID does not exist" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.234395 4831 scope.go:117] "RemoveContainer" containerID="708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8" Dec 03 07:36:13 crc kubenswrapper[4831]: E1203 07:36:13.268157 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8\": container with ID starting with 
708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8 not found: ID does not exist" containerID="708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.269043 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8"} err="failed to get container status \"708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8\": rpc error: code = NotFound desc = could not find container \"708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8\": container with ID starting with 708aff33ed28bbaf9e750ae437d1ce9b22109ffc851d308374165bd97d8054e8 not found: ID does not exist" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.269077 4831 scope.go:117] "RemoveContainer" containerID="45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588" Dec 03 07:36:13 crc kubenswrapper[4831]: E1203 07:36:13.269601 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588\": container with ID starting with 45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588 not found: ID does not exist" containerID="45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.269645 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588"} err="failed to get container status \"45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588\": rpc error: code = NotFound desc = could not find container \"45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588\": container with ID starting with 45ced1876e5ccc165bc542e14e196ae74d9861e50379e98e656b7e5468e86588 not found: ID does not 
exist" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.271606 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.271627 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnqpv\" (UniqueName: \"kubernetes.io/projected/865814ac-a91d-416f-b37e-a01f68b93277-kube-api-access-tnqpv\") on node \"crc\" DevicePath \"\"" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.271640 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/865814ac-a91d-416f-b37e-a01f68b93277-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.508032 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-92lp7"] Dec 03 07:36:13 crc kubenswrapper[4831]: I1203 07:36:13.514455 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-92lp7"] Dec 03 07:36:13 crc kubenswrapper[4831]: E1203 07:36:13.616256 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865814ac_a91d_416f_b37e_a01f68b93277.slice/crio-7605575e3c9a4789436c92748b8b243aef025855b8066c131d84eb1f299f0231\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865814ac_a91d_416f_b37e_a01f68b93277.slice\": RecentStats: unable to find data in memory cache]" Dec 03 07:36:15 crc kubenswrapper[4831]: I1203 07:36:15.023610 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="865814ac-a91d-416f-b37e-a01f68b93277" path="/var/lib/kubelet/pods/865814ac-a91d-416f-b37e-a01f68b93277/volumes" Dec 03 07:36:24 
crc kubenswrapper[4831]: I1203 07:36:24.013013 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:36:24 crc kubenswrapper[4831]: E1203 07:36:24.014576 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:36:39 crc kubenswrapper[4831]: I1203 07:36:39.012830 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:36:39 crc kubenswrapper[4831]: I1203 07:36:39.414607 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"8cb86e009a8773a29b8c741c9bd040665945d8c4b58ef0b0622c9afdb3272358"} Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.573262 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-98ccv"] Dec 03 07:38:37 crc kubenswrapper[4831]: E1203 07:38:37.574646 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865814ac-a91d-416f-b37e-a01f68b93277" containerName="extract-content" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.574679 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="865814ac-a91d-416f-b37e-a01f68b93277" containerName="extract-content" Dec 03 07:38:37 crc kubenswrapper[4831]: E1203 07:38:37.574716 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865814ac-a91d-416f-b37e-a01f68b93277" containerName="extract-utilities" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 
07:38:37.574733 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="865814ac-a91d-416f-b37e-a01f68b93277" containerName="extract-utilities" Dec 03 07:38:37 crc kubenswrapper[4831]: E1203 07:38:37.574760 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="865814ac-a91d-416f-b37e-a01f68b93277" containerName="registry-server" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.574777 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="865814ac-a91d-416f-b37e-a01f68b93277" containerName="registry-server" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.575280 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="865814ac-a91d-416f-b37e-a01f68b93277" containerName="registry-server" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.577621 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.610165 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98ccv"] Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.747569 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-utilities\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.747662 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-catalog-content\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.747797 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkskv\" (UniqueName: \"kubernetes.io/projected/69612aa3-2f33-4b7b-80c8-52f76656542c-kube-api-access-qkskv\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.849091 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qkskv\" (UniqueName: \"kubernetes.io/projected/69612aa3-2f33-4b7b-80c8-52f76656542c-kube-api-access-qkskv\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.849217 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-utilities\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.849255 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-catalog-content\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.849878 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-utilities\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.849928 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-catalog-content\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:37 crc kubenswrapper[4831]: I1203 07:38:37.920736 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkskv\" (UniqueName: \"kubernetes.io/projected/69612aa3-2f33-4b7b-80c8-52f76656542c-kube-api-access-qkskv\") pod \"redhat-operators-98ccv\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:38 crc kubenswrapper[4831]: I1203 07:38:38.207703 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:38 crc kubenswrapper[4831]: I1203 07:38:38.694561 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-98ccv"] Dec 03 07:38:39 crc kubenswrapper[4831]: I1203 07:38:39.525734 4831 generic.go:334] "Generic (PLEG): container finished" podID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerID="281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e" exitCode=0 Dec 03 07:38:39 crc kubenswrapper[4831]: I1203 07:38:39.526073 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98ccv" event={"ID":"69612aa3-2f33-4b7b-80c8-52f76656542c","Type":"ContainerDied","Data":"281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e"} Dec 03 07:38:39 crc kubenswrapper[4831]: I1203 07:38:39.526112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98ccv" event={"ID":"69612aa3-2f33-4b7b-80c8-52f76656542c","Type":"ContainerStarted","Data":"65e836140bf41ca78c332089aee15432a7b627be2880ab360ce97b253cc452c3"} Dec 03 07:38:40 crc kubenswrapper[4831]: I1203 07:38:40.542606 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98ccv" event={"ID":"69612aa3-2f33-4b7b-80c8-52f76656542c","Type":"ContainerStarted","Data":"fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f"} Dec 03 07:38:41 crc kubenswrapper[4831]: I1203 07:38:41.554841 4831 generic.go:334] "Generic (PLEG): container finished" podID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerID="fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f" exitCode=0 Dec 03 07:38:41 crc kubenswrapper[4831]: I1203 07:38:41.554928 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98ccv" event={"ID":"69612aa3-2f33-4b7b-80c8-52f76656542c","Type":"ContainerDied","Data":"fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f"} Dec 03 07:38:42 crc kubenswrapper[4831]: I1203 07:38:42.564827 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98ccv" event={"ID":"69612aa3-2f33-4b7b-80c8-52f76656542c","Type":"ContainerStarted","Data":"026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a"} Dec 03 07:38:42 crc kubenswrapper[4831]: I1203 07:38:42.609048 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-98ccv" podStartSLOduration=3.157435948 podStartE2EDuration="5.609030366s" podCreationTimestamp="2025-12-03 07:38:37 +0000 UTC" firstStartedPulling="2025-12-03 07:38:39.528451745 +0000 UTC m=+4056.872035283" lastFinishedPulling="2025-12-03 07:38:41.980046163 +0000 UTC m=+4059.323629701" observedRunningTime="2025-12-03 07:38:42.603491793 +0000 UTC m=+4059.947075321" watchObservedRunningTime="2025-12-03 07:38:42.609030366 +0000 UTC m=+4059.952613884" Dec 03 07:38:48 crc kubenswrapper[4831]: I1203 07:38:48.208586 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:48 crc 
kubenswrapper[4831]: I1203 07:38:48.210859 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:49 crc kubenswrapper[4831]: I1203 07:38:49.273064 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-98ccv" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="registry-server" probeResult="failure" output=< Dec 03 07:38:49 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 07:38:49 crc kubenswrapper[4831]: > Dec 03 07:38:57 crc kubenswrapper[4831]: I1203 07:38:57.597059 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:38:57 crc kubenswrapper[4831]: I1203 07:38:57.597918 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:38:58 crc kubenswrapper[4831]: I1203 07:38:58.280546 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:58 crc kubenswrapper[4831]: I1203 07:38:58.356812 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:38:58 crc kubenswrapper[4831]: I1203 07:38:58.527650 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98ccv"] Dec 03 07:38:59 crc kubenswrapper[4831]: I1203 07:38:59.717717 4831 kuberuntime_container.go:808] "Killing container 
with a grace period" pod="openshift-marketplace/redhat-operators-98ccv" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="registry-server" containerID="cri-o://026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a" gracePeriod=2 Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.223659 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.401558 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-catalog-content\") pod \"69612aa3-2f33-4b7b-80c8-52f76656542c\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.401726 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qkskv\" (UniqueName: \"kubernetes.io/projected/69612aa3-2f33-4b7b-80c8-52f76656542c-kube-api-access-qkskv\") pod \"69612aa3-2f33-4b7b-80c8-52f76656542c\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.401798 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-utilities\") pod \"69612aa3-2f33-4b7b-80c8-52f76656542c\" (UID: \"69612aa3-2f33-4b7b-80c8-52f76656542c\") " Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.402745 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-utilities" (OuterVolumeSpecName: "utilities") pod "69612aa3-2f33-4b7b-80c8-52f76656542c" (UID: "69612aa3-2f33-4b7b-80c8-52f76656542c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.407527 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69612aa3-2f33-4b7b-80c8-52f76656542c-kube-api-access-qkskv" (OuterVolumeSpecName: "kube-api-access-qkskv") pod "69612aa3-2f33-4b7b-80c8-52f76656542c" (UID: "69612aa3-2f33-4b7b-80c8-52f76656542c"). InnerVolumeSpecName "kube-api-access-qkskv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.503855 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qkskv\" (UniqueName: \"kubernetes.io/projected/69612aa3-2f33-4b7b-80c8-52f76656542c-kube-api-access-qkskv\") on node \"crc\" DevicePath \"\"" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.503895 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.550220 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "69612aa3-2f33-4b7b-80c8-52f76656542c" (UID: "69612aa3-2f33-4b7b-80c8-52f76656542c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.606006 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/69612aa3-2f33-4b7b-80c8-52f76656542c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.734019 4831 generic.go:334] "Generic (PLEG): container finished" podID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerID="026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a" exitCode=0 Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.734078 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98ccv" event={"ID":"69612aa3-2f33-4b7b-80c8-52f76656542c","Type":"ContainerDied","Data":"026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a"} Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.734150 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-98ccv" event={"ID":"69612aa3-2f33-4b7b-80c8-52f76656542c","Type":"ContainerDied","Data":"65e836140bf41ca78c332089aee15432a7b627be2880ab360ce97b253cc452c3"} Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.734221 4831 scope.go:117] "RemoveContainer" containerID="026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.736035 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-98ccv" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.759805 4831 scope.go:117] "RemoveContainer" containerID="fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.799049 4831 scope.go:117] "RemoveContainer" containerID="281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.802883 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-98ccv"] Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.818389 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-98ccv"] Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.834641 4831 scope.go:117] "RemoveContainer" containerID="026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a" Dec 03 07:39:00 crc kubenswrapper[4831]: E1203 07:39:00.835966 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a\": container with ID starting with 026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a not found: ID does not exist" containerID="026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.836037 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a"} err="failed to get container status \"026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a\": rpc error: code = NotFound desc = could not find container \"026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a\": container with ID starting with 026280b7bf9e2957467f52d41698442812342ba88d36b261b760d1aa9a111f3a not found: ID does 
not exist" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.836083 4831 scope.go:117] "RemoveContainer" containerID="fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f" Dec 03 07:39:00 crc kubenswrapper[4831]: E1203 07:39:00.836693 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f\": container with ID starting with fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f not found: ID does not exist" containerID="fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.836752 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f"} err="failed to get container status \"fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f\": rpc error: code = NotFound desc = could not find container \"fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f\": container with ID starting with fcbc743aa435f4dc9d9bfdfd7a1db6038cd619f782b52165abaa88eba2d8dd5f not found: ID does not exist" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.836786 4831 scope.go:117] "RemoveContainer" containerID="281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e" Dec 03 07:39:00 crc kubenswrapper[4831]: E1203 07:39:00.837189 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e\": container with ID starting with 281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e not found: ID does not exist" containerID="281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e" Dec 03 07:39:00 crc kubenswrapper[4831]: I1203 07:39:00.837229 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e"} err="failed to get container status \"281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e\": rpc error: code = NotFound desc = could not find container \"281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e\": container with ID starting with 281f5f7d08d9e3ea737e34de9db67a952b06accd485050b5a8deebb70411b66e not found: ID does not exist" Dec 03 07:39:01 crc kubenswrapper[4831]: I1203 07:39:01.023567 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" path="/var/lib/kubelet/pods/69612aa3-2f33-4b7b-80c8-52f76656542c/volumes" Dec 03 07:39:27 crc kubenswrapper[4831]: I1203 07:39:27.597843 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:39:27 crc kubenswrapper[4831]: I1203 07:39:27.598545 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.437495 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6ws8r"] Dec 03 07:39:57 crc kubenswrapper[4831]: E1203 07:39:57.438500 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="registry-server" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.438518 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="registry-server" Dec 03 07:39:57 crc kubenswrapper[4831]: E1203 07:39:57.438533 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="extract-content" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.438542 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="extract-content" Dec 03 07:39:57 crc kubenswrapper[4831]: E1203 07:39:57.438561 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="extract-utilities" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.438568 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="extract-utilities" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.438757 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="69612aa3-2f33-4b7b-80c8-52f76656542c" containerName="registry-server" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.466880 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.481775 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ws8r"] Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.536785 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9119e77-c547-4584-98b5-44c99988a1c0-utilities\") pod \"community-operators-6ws8r\" (UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.536867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9119e77-c547-4584-98b5-44c99988a1c0-catalog-content\") pod \"community-operators-6ws8r\" (UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.537018 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-glmkd\" (UniqueName: \"kubernetes.io/projected/d9119e77-c547-4584-98b5-44c99988a1c0-kube-api-access-glmkd\") pod \"community-operators-6ws8r\" (UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.596253 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.596312 4831 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.596567 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.597034 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8cb86e009a8773a29b8c741c9bd040665945d8c4b58ef0b0622c9afdb3272358"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.597090 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://8cb86e009a8773a29b8c741c9bd040665945d8c4b58ef0b0622c9afdb3272358" gracePeriod=600 Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.638817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-glmkd\" (UniqueName: \"kubernetes.io/projected/d9119e77-c547-4584-98b5-44c99988a1c0-kube-api-access-glmkd\") pod \"community-operators-6ws8r\" (UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.639136 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9119e77-c547-4584-98b5-44c99988a1c0-utilities\") pod \"community-operators-6ws8r\" 
(UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.639353 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9119e77-c547-4584-98b5-44c99988a1c0-catalog-content\") pod \"community-operators-6ws8r\" (UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.639961 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9119e77-c547-4584-98b5-44c99988a1c0-catalog-content\") pod \"community-operators-6ws8r\" (UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.640081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9119e77-c547-4584-98b5-44c99988a1c0-utilities\") pod \"community-operators-6ws8r\" (UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.681164 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-glmkd\" (UniqueName: \"kubernetes.io/projected/d9119e77-c547-4584-98b5-44c99988a1c0-kube-api-access-glmkd\") pod \"community-operators-6ws8r\" (UID: \"d9119e77-c547-4584-98b5-44c99988a1c0\") " pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:57 crc kubenswrapper[4831]: I1203 07:39:57.802364 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:39:58 crc kubenswrapper[4831]: I1203 07:39:58.083525 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ws8r"] Dec 03 07:39:58 crc kubenswrapper[4831]: W1203 07:39:58.091351 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9119e77_c547_4584_98b5_44c99988a1c0.slice/crio-a6192e06bfd139bc96b9b2fad6bb6bf5b0edc6f886d5f466af7f355f4bdaf3af WatchSource:0}: Error finding container a6192e06bfd139bc96b9b2fad6bb6bf5b0edc6f886d5f466af7f355f4bdaf3af: Status 404 returned error can't find the container with id a6192e06bfd139bc96b9b2fad6bb6bf5b0edc6f886d5f466af7f355f4bdaf3af Dec 03 07:39:58 crc kubenswrapper[4831]: I1203 07:39:58.294833 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="8cb86e009a8773a29b8c741c9bd040665945d8c4b58ef0b0622c9afdb3272358" exitCode=0 Dec 03 07:39:58 crc kubenswrapper[4831]: I1203 07:39:58.294911 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"8cb86e009a8773a29b8c741c9bd040665945d8c4b58ef0b0622c9afdb3272358"} Dec 03 07:39:58 crc kubenswrapper[4831]: I1203 07:39:58.294955 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc"} Dec 03 07:39:58 crc kubenswrapper[4831]: I1203 07:39:58.294975 4831 scope.go:117] "RemoveContainer" containerID="a99166a31a8cdfca263d0d4062c91e96060efa1b7bb5dc271b65a867a588f721" Dec 03 07:39:58 crc kubenswrapper[4831]: I1203 07:39:58.297613 4831 generic.go:334] "Generic (PLEG): 
container finished" podID="d9119e77-c547-4584-98b5-44c99988a1c0" containerID="bf6facd14154fb2fe4a4c0fa6dcc2dc75ba1133bf49b6d70a97375fcbff1221b" exitCode=0 Dec 03 07:39:58 crc kubenswrapper[4831]: I1203 07:39:58.297650 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ws8r" event={"ID":"d9119e77-c547-4584-98b5-44c99988a1c0","Type":"ContainerDied","Data":"bf6facd14154fb2fe4a4c0fa6dcc2dc75ba1133bf49b6d70a97375fcbff1221b"} Dec 03 07:39:58 crc kubenswrapper[4831]: I1203 07:39:58.297676 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ws8r" event={"ID":"d9119e77-c547-4584-98b5-44c99988a1c0","Type":"ContainerStarted","Data":"a6192e06bfd139bc96b9b2fad6bb6bf5b0edc6f886d5f466af7f355f4bdaf3af"} Dec 03 07:40:02 crc kubenswrapper[4831]: I1203 07:40:02.349185 4831 generic.go:334] "Generic (PLEG): container finished" podID="d9119e77-c547-4584-98b5-44c99988a1c0" containerID="9431bd86f2910a98ba7181e7d429e4edf702441a1b4933e3dd7c2851e5cf2a1f" exitCode=0 Dec 03 07:40:02 crc kubenswrapper[4831]: I1203 07:40:02.349265 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ws8r" event={"ID":"d9119e77-c547-4584-98b5-44c99988a1c0","Type":"ContainerDied","Data":"9431bd86f2910a98ba7181e7d429e4edf702441a1b4933e3dd7c2851e5cf2a1f"} Dec 03 07:40:03 crc kubenswrapper[4831]: I1203 07:40:03.363205 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6ws8r" event={"ID":"d9119e77-c547-4584-98b5-44c99988a1c0","Type":"ContainerStarted","Data":"68da69071fe0b356a36fb6900410a974215a8a54a0186b91501e65541c9e8bb8"} Dec 03 07:40:03 crc kubenswrapper[4831]: I1203 07:40:03.396452 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6ws8r" podStartSLOduration=1.677649046 podStartE2EDuration="6.396418442s" podCreationTimestamp="2025-12-03 07:39:57 
+0000 UTC" firstStartedPulling="2025-12-03 07:39:58.298801749 +0000 UTC m=+4135.642385257" lastFinishedPulling="2025-12-03 07:40:03.017571155 +0000 UTC m=+4140.361154653" observedRunningTime="2025-12-03 07:40:03.388310529 +0000 UTC m=+4140.731894067" watchObservedRunningTime="2025-12-03 07:40:03.396418442 +0000 UTC m=+4140.740001990" Dec 03 07:40:07 crc kubenswrapper[4831]: I1203 07:40:07.803379 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:40:07 crc kubenswrapper[4831]: I1203 07:40:07.804032 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:40:07 crc kubenswrapper[4831]: I1203 07:40:07.864651 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:40:08 crc kubenswrapper[4831]: I1203 07:40:08.477260 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6ws8r" Dec 03 07:40:08 crc kubenswrapper[4831]: I1203 07:40:08.587168 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6ws8r"] Dec 03 07:40:08 crc kubenswrapper[4831]: I1203 07:40:08.659234 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xdz7k"] Dec 03 07:40:08 crc kubenswrapper[4831]: I1203 07:40:08.659675 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xdz7k" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerName="registry-server" containerID="cri-o://e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688" gracePeriod=2 Dec 03 07:40:08 crc kubenswrapper[4831]: E1203 07:40:08.842726 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3947b643_3e94_4f5b_81d2_b2132bb654f8.slice/crio-e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688.scope\": RecentStats: unable to find data in memory cache]" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.086971 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdz7k" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.223195 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-utilities\") pod \"3947b643-3e94-4f5b-81d2-b2132bb654f8\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.223982 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6wpn\" (UniqueName: \"kubernetes.io/projected/3947b643-3e94-4f5b-81d2-b2132bb654f8-kube-api-access-v6wpn\") pod \"3947b643-3e94-4f5b-81d2-b2132bb654f8\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.223888 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-utilities" (OuterVolumeSpecName: "utilities") pod "3947b643-3e94-4f5b-81d2-b2132bb654f8" (UID: "3947b643-3e94-4f5b-81d2-b2132bb654f8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.224748 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-catalog-content\") pod \"3947b643-3e94-4f5b-81d2-b2132bb654f8\" (UID: \"3947b643-3e94-4f5b-81d2-b2132bb654f8\") " Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.225007 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.229662 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3947b643-3e94-4f5b-81d2-b2132bb654f8-kube-api-access-v6wpn" (OuterVolumeSpecName: "kube-api-access-v6wpn") pod "3947b643-3e94-4f5b-81d2-b2132bb654f8" (UID: "3947b643-3e94-4f5b-81d2-b2132bb654f8"). InnerVolumeSpecName "kube-api-access-v6wpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.270420 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3947b643-3e94-4f5b-81d2-b2132bb654f8" (UID: "3947b643-3e94-4f5b-81d2-b2132bb654f8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.326576 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3947b643-3e94-4f5b-81d2-b2132bb654f8-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.326607 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6wpn\" (UniqueName: \"kubernetes.io/projected/3947b643-3e94-4f5b-81d2-b2132bb654f8-kube-api-access-v6wpn\") on node \"crc\" DevicePath \"\"" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.411383 4831 generic.go:334] "Generic (PLEG): container finished" podID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerID="e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688" exitCode=0 Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.412051 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xdz7k" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.416402 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz7k" event={"ID":"3947b643-3e94-4f5b-81d2-b2132bb654f8","Type":"ContainerDied","Data":"e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688"} Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.416469 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xdz7k" event={"ID":"3947b643-3e94-4f5b-81d2-b2132bb654f8","Type":"ContainerDied","Data":"b79afb3b88ece017c9ee646e826fc66d355e200077bdc50b5eb9f707aa27989e"} Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.416493 4831 scope.go:117] "RemoveContainer" containerID="e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.446125 4831 scope.go:117] "RemoveContainer" 
containerID="314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.447633 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xdz7k"] Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.455943 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xdz7k"] Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.463335 4831 scope.go:117] "RemoveContainer" containerID="d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.490870 4831 scope.go:117] "RemoveContainer" containerID="e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688" Dec 03 07:40:09 crc kubenswrapper[4831]: E1203 07:40:09.491182 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688\": container with ID starting with e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688 not found: ID does not exist" containerID="e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.491224 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688"} err="failed to get container status \"e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688\": rpc error: code = NotFound desc = could not find container \"e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688\": container with ID starting with e25a93646255b1e2c4e4a9b209e9f47d639c7bd47f16aaffcee59143d4f89688 not found: ID does not exist" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.491250 4831 scope.go:117] "RemoveContainer" 
containerID="314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2" Dec 03 07:40:09 crc kubenswrapper[4831]: E1203 07:40:09.491536 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2\": container with ID starting with 314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2 not found: ID does not exist" containerID="314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.491567 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2"} err="failed to get container status \"314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2\": rpc error: code = NotFound desc = could not find container \"314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2\": container with ID starting with 314282480cdfb5db852f4b7e2c8fac6c0ead84eccd80ad70cf59bd1e7402e3c2 not found: ID does not exist" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.491584 4831 scope.go:117] "RemoveContainer" containerID="d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb" Dec 03 07:40:09 crc kubenswrapper[4831]: E1203 07:40:09.491833 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb\": container with ID starting with d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb not found: ID does not exist" containerID="d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb" Dec 03 07:40:09 crc kubenswrapper[4831]: I1203 07:40:09.491860 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb"} err="failed to get container status \"d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb\": rpc error: code = NotFound desc = could not find container \"d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb\": container with ID starting with d60331c48ea2f7b73c5fe3185dc9f893d5651749b3b79e7062c9d0cc1b1408eb not found: ID does not exist" Dec 03 07:40:11 crc kubenswrapper[4831]: I1203 07:40:11.040227 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" path="/var/lib/kubelet/pods/3947b643-3e94-4f5b-81d2-b2132bb654f8/volumes" Dec 03 07:41:57 crc kubenswrapper[4831]: I1203 07:41:57.596448 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:41:57 crc kubenswrapper[4831]: I1203 07:41:57.598453 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:42:27 crc kubenswrapper[4831]: I1203 07:42:27.597561 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:42:27 crc kubenswrapper[4831]: I1203 07:42:27.598278 4831 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:42:57 crc kubenswrapper[4831]: I1203 07:42:57.597185 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:42:57 crc kubenswrapper[4831]: I1203 07:42:57.598103 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:42:57 crc kubenswrapper[4831]: I1203 07:42:57.598192 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 07:42:57 crc kubenswrapper[4831]: I1203 07:42:57.599513 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:42:57 crc kubenswrapper[4831]: I1203 07:42:57.599629 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" 
containerID="cri-o://a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" gracePeriod=600 Dec 03 07:42:57 crc kubenswrapper[4831]: E1203 07:42:57.736055 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:42:58 crc kubenswrapper[4831]: I1203 07:42:58.110987 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" exitCode=0 Dec 03 07:42:58 crc kubenswrapper[4831]: I1203 07:42:58.111057 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc"} Dec 03 07:42:58 crc kubenswrapper[4831]: I1203 07:42:58.111162 4831 scope.go:117] "RemoveContainer" containerID="8cb86e009a8773a29b8c741c9bd040665945d8c4b58ef0b0622c9afdb3272358" Dec 03 07:42:58 crc kubenswrapper[4831]: I1203 07:42:58.111895 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:42:58 crc kubenswrapper[4831]: E1203 07:42:58.112368 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:43:12 crc kubenswrapper[4831]: I1203 07:43:12.012857 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:43:12 crc kubenswrapper[4831]: E1203 07:43:12.014022 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:43:24 crc kubenswrapper[4831]: I1203 07:43:24.013137 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:43:24 crc kubenswrapper[4831]: E1203 07:43:24.013891 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:43:39 crc kubenswrapper[4831]: I1203 07:43:39.013077 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:43:39 crc kubenswrapper[4831]: E1203 07:43:39.014197 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.680059 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-krlrw"] Dec 03 07:43:46 crc kubenswrapper[4831]: E1203 07:43:46.680964 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerName="extract-content" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.680978 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerName="extract-content" Dec 03 07:43:46 crc kubenswrapper[4831]: E1203 07:43:46.681008 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerName="extract-utilities" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.681020 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerName="extract-utilities" Dec 03 07:43:46 crc kubenswrapper[4831]: E1203 07:43:46.681049 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerName="registry-server" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.681058 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerName="registry-server" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.682807 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3947b643-3e94-4f5b-81d2-b2132bb654f8" containerName="registry-server" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.685125 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.691765 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krlrw"] Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.801007 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-utilities\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.801166 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-catalog-content\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.801424 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9hc6\" (UniqueName: \"kubernetes.io/projected/59b98fba-1746-44cd-a4b4-5d2e718a24f0-kube-api-access-d9hc6\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.902244 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-catalog-content\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.902408 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-d9hc6\" (UniqueName: \"kubernetes.io/projected/59b98fba-1746-44cd-a4b4-5d2e718a24f0-kube-api-access-d9hc6\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.902445 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-utilities\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.902915 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-catalog-content\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:46 crc kubenswrapper[4831]: I1203 07:43:46.903001 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-utilities\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:47 crc kubenswrapper[4831]: I1203 07:43:47.522379 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9hc6\" (UniqueName: \"kubernetes.io/projected/59b98fba-1746-44cd-a4b4-5d2e718a24f0-kube-api-access-d9hc6\") pod \"certified-operators-krlrw\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:47 crc kubenswrapper[4831]: I1203 07:43:47.615234 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:48 crc kubenswrapper[4831]: I1203 07:43:48.067303 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-krlrw"] Dec 03 07:43:48 crc kubenswrapper[4831]: I1203 07:43:48.599403 4831 generic.go:334] "Generic (PLEG): container finished" podID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerID="4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b" exitCode=0 Dec 03 07:43:48 crc kubenswrapper[4831]: I1203 07:43:48.599489 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krlrw" event={"ID":"59b98fba-1746-44cd-a4b4-5d2e718a24f0","Type":"ContainerDied","Data":"4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b"} Dec 03 07:43:48 crc kubenswrapper[4831]: I1203 07:43:48.599778 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krlrw" event={"ID":"59b98fba-1746-44cd-a4b4-5d2e718a24f0","Type":"ContainerStarted","Data":"e2117107869556fcb753154299532cd0efb08a8bc74e3dc770808564c5f12a0c"} Dec 03 07:43:48 crc kubenswrapper[4831]: I1203 07:43:48.602181 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:43:50 crc kubenswrapper[4831]: I1203 07:43:50.616909 4831 generic.go:334] "Generic (PLEG): container finished" podID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerID="571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472" exitCode=0 Dec 03 07:43:50 crc kubenswrapper[4831]: I1203 07:43:50.617028 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krlrw" event={"ID":"59b98fba-1746-44cd-a4b4-5d2e718a24f0","Type":"ContainerDied","Data":"571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472"} Dec 03 07:43:51 crc kubenswrapper[4831]: I1203 07:43:51.629007 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-krlrw" event={"ID":"59b98fba-1746-44cd-a4b4-5d2e718a24f0","Type":"ContainerStarted","Data":"ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780"} Dec 03 07:43:51 crc kubenswrapper[4831]: I1203 07:43:51.657822 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-krlrw" podStartSLOduration=3.140815294 podStartE2EDuration="5.657789573s" podCreationTimestamp="2025-12-03 07:43:46 +0000 UTC" firstStartedPulling="2025-12-03 07:43:48.601703126 +0000 UTC m=+4365.945286664" lastFinishedPulling="2025-12-03 07:43:51.118677425 +0000 UTC m=+4368.462260943" observedRunningTime="2025-12-03 07:43:51.654475809 +0000 UTC m=+4368.998059317" watchObservedRunningTime="2025-12-03 07:43:51.657789573 +0000 UTC m=+4369.001373091" Dec 03 07:43:52 crc kubenswrapper[4831]: I1203 07:43:52.012618 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:43:52 crc kubenswrapper[4831]: E1203 07:43:52.013064 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:43:57 crc kubenswrapper[4831]: I1203 07:43:57.616796 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:57 crc kubenswrapper[4831]: I1203 07:43:57.617092 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:57 crc kubenswrapper[4831]: I1203 07:43:57.675003 4831 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:57 crc kubenswrapper[4831]: I1203 07:43:57.760783 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:43:57 crc kubenswrapper[4831]: I1203 07:43:57.926540 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krlrw"] Dec 03 07:43:59 crc kubenswrapper[4831]: I1203 07:43:59.704829 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-krlrw" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerName="registry-server" containerID="cri-o://ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780" gracePeriod=2 Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.223214 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.424450 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9hc6\" (UniqueName: \"kubernetes.io/projected/59b98fba-1746-44cd-a4b4-5d2e718a24f0-kube-api-access-d9hc6\") pod \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.424516 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-utilities\") pod \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.424566 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-catalog-content\") pod \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\" (UID: \"59b98fba-1746-44cd-a4b4-5d2e718a24f0\") " Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.425713 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-utilities" (OuterVolumeSpecName: "utilities") pod "59b98fba-1746-44cd-a4b4-5d2e718a24f0" (UID: "59b98fba-1746-44cd-a4b4-5d2e718a24f0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.432950 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/59b98fba-1746-44cd-a4b4-5d2e718a24f0-kube-api-access-d9hc6" (OuterVolumeSpecName: "kube-api-access-d9hc6") pod "59b98fba-1746-44cd-a4b4-5d2e718a24f0" (UID: "59b98fba-1746-44cd-a4b4-5d2e718a24f0"). InnerVolumeSpecName "kube-api-access-d9hc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.526860 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9hc6\" (UniqueName: \"kubernetes.io/projected/59b98fba-1746-44cd-a4b4-5d2e718a24f0-kube-api-access-d9hc6\") on node \"crc\" DevicePath \"\"" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.526916 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.717656 4831 generic.go:334] "Generic (PLEG): container finished" podID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerID="ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780" exitCode=0 Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.717740 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krlrw" event={"ID":"59b98fba-1746-44cd-a4b4-5d2e718a24f0","Type":"ContainerDied","Data":"ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780"} Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.717770 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-krlrw" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.717799 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-krlrw" event={"ID":"59b98fba-1746-44cd-a4b4-5d2e718a24f0","Type":"ContainerDied","Data":"e2117107869556fcb753154299532cd0efb08a8bc74e3dc770808564c5f12a0c"} Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.717869 4831 scope.go:117] "RemoveContainer" containerID="ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.745417 4831 scope.go:117] "RemoveContainer" containerID="571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.771924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "59b98fba-1746-44cd-a4b4-5d2e718a24f0" (UID: "59b98fba-1746-44cd-a4b4-5d2e718a24f0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:44:00 crc kubenswrapper[4831]: I1203 07:44:00.833784 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/59b98fba-1746-44cd-a4b4-5d2e718a24f0-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:44:01 crc kubenswrapper[4831]: I1203 07:44:01.075115 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-krlrw"] Dec 03 07:44:01 crc kubenswrapper[4831]: I1203 07:44:01.086525 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-krlrw"] Dec 03 07:44:01 crc kubenswrapper[4831]: I1203 07:44:01.131418 4831 scope.go:117] "RemoveContainer" containerID="4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b" Dec 03 07:44:01 crc kubenswrapper[4831]: I1203 07:44:01.174690 4831 scope.go:117] "RemoveContainer" containerID="ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780" Dec 03 07:44:01 crc kubenswrapper[4831]: E1203 07:44:01.175260 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780\": container with ID starting with ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780 not found: ID does not exist" containerID="ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780" Dec 03 07:44:01 crc kubenswrapper[4831]: I1203 07:44:01.175356 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780"} err="failed to get container status \"ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780\": rpc error: code = NotFound desc = could not find container \"ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780\": container with ID starting with 
ff139dbffac0dc42fcc41bc713c768ca0b1e6602218669a7d707ca7d5a680780 not found: ID does not exist" Dec 03 07:44:01 crc kubenswrapper[4831]: I1203 07:44:01.175403 4831 scope.go:117] "RemoveContainer" containerID="571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472" Dec 03 07:44:01 crc kubenswrapper[4831]: E1203 07:44:01.177019 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472\": container with ID starting with 571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472 not found: ID does not exist" containerID="571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472" Dec 03 07:44:01 crc kubenswrapper[4831]: I1203 07:44:01.177061 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472"} err="failed to get container status \"571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472\": rpc error: code = NotFound desc = could not find container \"571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472\": container with ID starting with 571e3bc4609687e6a1a494897baadc52a3eeec886f2c589d9a1be73a9f14b472 not found: ID does not exist" Dec 03 07:44:01 crc kubenswrapper[4831]: I1203 07:44:01.177094 4831 scope.go:117] "RemoveContainer" containerID="4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b" Dec 03 07:44:01 crc kubenswrapper[4831]: E1203 07:44:01.177860 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b\": container with ID starting with 4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b not found: ID does not exist" containerID="4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b" Dec 03 07:44:01 crc 
kubenswrapper[4831]: I1203 07:44:01.177924 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b"} err="failed to get container status \"4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b\": rpc error: code = NotFound desc = could not find container \"4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b\": container with ID starting with 4a56e9086220b808c611da6b2f6a52de58cf643e71b086637f132823ea0fd01b not found: ID does not exist" Dec 03 07:44:03 crc kubenswrapper[4831]: I1203 07:44:03.022146 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:44:03 crc kubenswrapper[4831]: E1203 07:44:03.022842 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:44:03 crc kubenswrapper[4831]: I1203 07:44:03.030758 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" path="/var/lib/kubelet/pods/59b98fba-1746-44cd-a4b4-5d2e718a24f0/volumes" Dec 03 07:44:17 crc kubenswrapper[4831]: I1203 07:44:17.012839 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:44:17 crc kubenswrapper[4831]: E1203 07:44:17.013646 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:44:29 crc kubenswrapper[4831]: I1203 07:44:29.013853 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:44:29 crc kubenswrapper[4831]: E1203 07:44:29.015058 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:44:41 crc kubenswrapper[4831]: I1203 07:44:41.013445 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:44:41 crc kubenswrapper[4831]: E1203 07:44:41.014407 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:44:52 crc kubenswrapper[4831]: I1203 07:44:52.013517 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:44:52 crc kubenswrapper[4831]: E1203 07:44:52.014660 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.185936 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m"] Dec 03 07:45:00 crc kubenswrapper[4831]: E1203 07:45:00.187026 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerName="extract-utilities" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.187042 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerName="extract-utilities" Dec 03 07:45:00 crc kubenswrapper[4831]: E1203 07:45:00.187070 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerName="extract-content" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.187077 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerName="extract-content" Dec 03 07:45:00 crc kubenswrapper[4831]: E1203 07:45:00.187097 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.187104 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.187243 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="59b98fba-1746-44cd-a4b4-5d2e718a24f0" containerName="registry-server" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.187740 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.190942 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.191189 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.200667 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m"] Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.362356 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aaf4a58-6301-4f40-99a4-90993f851a8d-secret-volume\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.362695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjbwx\" (UniqueName: \"kubernetes.io/projected/9aaf4a58-6301-4f40-99a4-90993f851a8d-kube-api-access-sjbwx\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.362846 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aaf4a58-6301-4f40-99a4-90993f851a8d-config-volume\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.464157 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aaf4a58-6301-4f40-99a4-90993f851a8d-config-volume\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.464728 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aaf4a58-6301-4f40-99a4-90993f851a8d-secret-volume\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.464951 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sjbwx\" (UniqueName: \"kubernetes.io/projected/9aaf4a58-6301-4f40-99a4-90993f851a8d-kube-api-access-sjbwx\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.465708 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aaf4a58-6301-4f40-99a4-90993f851a8d-config-volume\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.476187 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/9aaf4a58-6301-4f40-99a4-90993f851a8d-secret-volume\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.481656 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjbwx\" (UniqueName: \"kubernetes.io/projected/9aaf4a58-6301-4f40-99a4-90993f851a8d-kube-api-access-sjbwx\") pod \"collect-profiles-29412465-t7g9m\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.543430 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:00 crc kubenswrapper[4831]: I1203 07:45:00.984982 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m"] Dec 03 07:45:01 crc kubenswrapper[4831]: I1203 07:45:01.271125 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" event={"ID":"9aaf4a58-6301-4f40-99a4-90993f851a8d","Type":"ContainerStarted","Data":"7e1d6c9217938e0f35c6dd4a17856d7002ac9e4bdfab8ece00be02a1dda9e274"} Dec 03 07:45:01 crc kubenswrapper[4831]: I1203 07:45:01.271199 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" event={"ID":"9aaf4a58-6301-4f40-99a4-90993f851a8d","Type":"ContainerStarted","Data":"927c35be7655d5bd19d5521e437aceecca90ba74b679f02121d6a7de1411aa3d"} Dec 03 07:45:01 crc kubenswrapper[4831]: I1203 07:45:01.293512 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" 
podStartSLOduration=1.293487422 podStartE2EDuration="1.293487422s" podCreationTimestamp="2025-12-03 07:45:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:45:01.288563618 +0000 UTC m=+4438.632147156" watchObservedRunningTime="2025-12-03 07:45:01.293487422 +0000 UTC m=+4438.637070930" Dec 03 07:45:02 crc kubenswrapper[4831]: I1203 07:45:02.283467 4831 generic.go:334] "Generic (PLEG): container finished" podID="9aaf4a58-6301-4f40-99a4-90993f851a8d" containerID="7e1d6c9217938e0f35c6dd4a17856d7002ac9e4bdfab8ece00be02a1dda9e274" exitCode=0 Dec 03 07:45:02 crc kubenswrapper[4831]: I1203 07:45:02.283542 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" event={"ID":"9aaf4a58-6301-4f40-99a4-90993f851a8d","Type":"ContainerDied","Data":"7e1d6c9217938e0f35c6dd4a17856d7002ac9e4bdfab8ece00be02a1dda9e274"} Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.673908 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.819935 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjbwx\" (UniqueName: \"kubernetes.io/projected/9aaf4a58-6301-4f40-99a4-90993f851a8d-kube-api-access-sjbwx\") pod \"9aaf4a58-6301-4f40-99a4-90993f851a8d\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.820114 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aaf4a58-6301-4f40-99a4-90993f851a8d-secret-volume\") pod \"9aaf4a58-6301-4f40-99a4-90993f851a8d\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.820241 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aaf4a58-6301-4f40-99a4-90993f851a8d-config-volume\") pod \"9aaf4a58-6301-4f40-99a4-90993f851a8d\" (UID: \"9aaf4a58-6301-4f40-99a4-90993f851a8d\") " Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.821157 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9aaf4a58-6301-4f40-99a4-90993f851a8d-config-volume" (OuterVolumeSpecName: "config-volume") pod "9aaf4a58-6301-4f40-99a4-90993f851a8d" (UID: "9aaf4a58-6301-4f40-99a4-90993f851a8d"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.821918 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9aaf4a58-6301-4f40-99a4-90993f851a8d-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.827953 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aaf4a58-6301-4f40-99a4-90993f851a8d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "9aaf4a58-6301-4f40-99a4-90993f851a8d" (UID: "9aaf4a58-6301-4f40-99a4-90993f851a8d"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.828859 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aaf4a58-6301-4f40-99a4-90993f851a8d-kube-api-access-sjbwx" (OuterVolumeSpecName: "kube-api-access-sjbwx") pod "9aaf4a58-6301-4f40-99a4-90993f851a8d" (UID: "9aaf4a58-6301-4f40-99a4-90993f851a8d"). InnerVolumeSpecName "kube-api-access-sjbwx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.923531 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjbwx\" (UniqueName: \"kubernetes.io/projected/9aaf4a58-6301-4f40-99a4-90993f851a8d-kube-api-access-sjbwx\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:03 crc kubenswrapper[4831]: I1203 07:45:03.923590 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/9aaf4a58-6301-4f40-99a4-90993f851a8d-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 07:45:04 crc kubenswrapper[4831]: I1203 07:45:04.308164 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" event={"ID":"9aaf4a58-6301-4f40-99a4-90993f851a8d","Type":"ContainerDied","Data":"927c35be7655d5bd19d5521e437aceecca90ba74b679f02121d6a7de1411aa3d"} Dec 03 07:45:04 crc kubenswrapper[4831]: I1203 07:45:04.308620 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="927c35be7655d5bd19d5521e437aceecca90ba74b679f02121d6a7de1411aa3d" Dec 03 07:45:04 crc kubenswrapper[4831]: I1203 07:45:04.308240 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m" Dec 03 07:45:04 crc kubenswrapper[4831]: I1203 07:45:04.393192 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh"] Dec 03 07:45:04 crc kubenswrapper[4831]: I1203 07:45:04.400595 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412420-x54xh"] Dec 03 07:45:05 crc kubenswrapper[4831]: I1203 07:45:05.015224 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:45:05 crc kubenswrapper[4831]: E1203 07:45:05.015569 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:45:05 crc kubenswrapper[4831]: I1203 07:45:05.027479 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba20a9e0-a368-4398-af6d-2b8ce42de91a" path="/var/lib/kubelet/pods/ba20a9e0-a368-4398-af6d-2b8ce42de91a/volumes" Dec 03 07:45:18 crc kubenswrapper[4831]: I1203 07:45:18.017573 4831 scope.go:117] "RemoveContainer" containerID="8dff0c63ec02427be9dc3bb1766830881fa00b3e23393bff150f01bc6ef7f266" Dec 03 07:45:20 crc kubenswrapper[4831]: I1203 07:45:20.012493 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:45:20 crc kubenswrapper[4831]: E1203 07:45:20.014258 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:45:32 crc kubenswrapper[4831]: I1203 07:45:32.013296 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:45:32 crc kubenswrapper[4831]: E1203 07:45:32.014910 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:45:45 crc kubenswrapper[4831]: I1203 07:45:45.013052 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:45:45 crc kubenswrapper[4831]: E1203 07:45:45.014211 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:45:56 crc kubenswrapper[4831]: I1203 07:45:56.013041 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:45:56 crc kubenswrapper[4831]: E1203 07:45:56.013808 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:46:11 crc kubenswrapper[4831]: I1203 07:46:11.012782 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:46:11 crc kubenswrapper[4831]: E1203 07:46:11.013899 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.248167 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5t86s"] Dec 03 07:46:14 crc kubenswrapper[4831]: E1203 07:46:14.248952 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aaf4a58-6301-4f40-99a4-90993f851a8d" containerName="collect-profiles" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.248993 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aaf4a58-6301-4f40-99a4-90993f851a8d" containerName="collect-profiles" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.249227 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aaf4a58-6301-4f40-99a4-90993f851a8d" containerName="collect-profiles" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.251287 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.264752 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5t86s"] Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.416274 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-utilities\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.416469 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-catalog-content\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.416684 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x58z2\" (UniqueName: \"kubernetes.io/projected/76516b9d-8ed8-45d7-adcc-d26065ac1237-kube-api-access-x58z2\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.517993 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-catalog-content\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.518228 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x58z2\" (UniqueName: \"kubernetes.io/projected/76516b9d-8ed8-45d7-adcc-d26065ac1237-kube-api-access-x58z2\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.518519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-utilities\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.519010 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-catalog-content\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.519048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-utilities\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.546995 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x58z2\" (UniqueName: \"kubernetes.io/projected/76516b9d-8ed8-45d7-adcc-d26065ac1237-kube-api-access-x58z2\") pod \"redhat-marketplace-5t86s\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:14 crc kubenswrapper[4831]: I1203 07:46:14.589954 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:15 crc kubenswrapper[4831]: I1203 07:46:15.119094 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5t86s"] Dec 03 07:46:15 crc kubenswrapper[4831]: I1203 07:46:15.942913 4831 generic.go:334] "Generic (PLEG): container finished" podID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerID="49630976e7fb8afc4adbd3cbaf0e095679d5b5f644e3d6b8d023db21da3ae1c6" exitCode=0 Dec 03 07:46:15 crc kubenswrapper[4831]: I1203 07:46:15.943020 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t86s" event={"ID":"76516b9d-8ed8-45d7-adcc-d26065ac1237","Type":"ContainerDied","Data":"49630976e7fb8afc4adbd3cbaf0e095679d5b5f644e3d6b8d023db21da3ae1c6"} Dec 03 07:46:15 crc kubenswrapper[4831]: I1203 07:46:15.943344 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t86s" event={"ID":"76516b9d-8ed8-45d7-adcc-d26065ac1237","Type":"ContainerStarted","Data":"62fbf96d8fb35072fff290ccfb47720ce728118c0fe3620047357fbfbac12a07"} Dec 03 07:46:16 crc kubenswrapper[4831]: I1203 07:46:16.955372 4831 generic.go:334] "Generic (PLEG): container finished" podID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerID="8715c38d9a5c580643523f51a3a9e615ad80d85056859b85d7fcb9963d0ce2a5" exitCode=0 Dec 03 07:46:16 crc kubenswrapper[4831]: I1203 07:46:16.955438 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t86s" event={"ID":"76516b9d-8ed8-45d7-adcc-d26065ac1237","Type":"ContainerDied","Data":"8715c38d9a5c580643523f51a3a9e615ad80d85056859b85d7fcb9963d0ce2a5"} Dec 03 07:46:17 crc kubenswrapper[4831]: I1203 07:46:17.964358 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t86s" 
event={"ID":"76516b9d-8ed8-45d7-adcc-d26065ac1237","Type":"ContainerStarted","Data":"e38b4507305a0ed640a5cf479cd4b9343ed4409d9609dbb9831335a341f4e4ff"} Dec 03 07:46:17 crc kubenswrapper[4831]: I1203 07:46:17.991230 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5t86s" podStartSLOduration=2.467849528 podStartE2EDuration="3.99121058s" podCreationTimestamp="2025-12-03 07:46:14 +0000 UTC" firstStartedPulling="2025-12-03 07:46:15.944996057 +0000 UTC m=+4513.288579595" lastFinishedPulling="2025-12-03 07:46:17.468357099 +0000 UTC m=+4514.811940647" observedRunningTime="2025-12-03 07:46:17.984435659 +0000 UTC m=+4515.328019167" watchObservedRunningTime="2025-12-03 07:46:17.99121058 +0000 UTC m=+4515.334794098" Dec 03 07:46:24 crc kubenswrapper[4831]: I1203 07:46:24.014047 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:46:24 crc kubenswrapper[4831]: E1203 07:46:24.015274 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:46:24 crc kubenswrapper[4831]: I1203 07:46:24.590195 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:24 crc kubenswrapper[4831]: I1203 07:46:24.590595 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:24 crc kubenswrapper[4831]: I1203 07:46:24.674530 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:25 crc kubenswrapper[4831]: I1203 07:46:25.084973 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:25 crc kubenswrapper[4831]: I1203 07:46:25.193195 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5t86s"] Dec 03 07:46:27 crc kubenswrapper[4831]: I1203 07:46:27.039809 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5t86s" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerName="registry-server" containerID="cri-o://e38b4507305a0ed640a5cf479cd4b9343ed4409d9609dbb9831335a341f4e4ff" gracePeriod=2 Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.050417 4831 generic.go:334] "Generic (PLEG): container finished" podID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerID="e38b4507305a0ed640a5cf479cd4b9343ed4409d9609dbb9831335a341f4e4ff" exitCode=0 Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.050640 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t86s" event={"ID":"76516b9d-8ed8-45d7-adcc-d26065ac1237","Type":"ContainerDied","Data":"e38b4507305a0ed640a5cf479cd4b9343ed4409d9609dbb9831335a341f4e4ff"} Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.050814 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5t86s" event={"ID":"76516b9d-8ed8-45d7-adcc-d26065ac1237","Type":"ContainerDied","Data":"62fbf96d8fb35072fff290ccfb47720ce728118c0fe3620047357fbfbac12a07"} Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.050856 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="62fbf96d8fb35072fff290ccfb47720ce728118c0fe3620047357fbfbac12a07" Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.069596 4831 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.261080 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-utilities\") pod \"76516b9d-8ed8-45d7-adcc-d26065ac1237\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.261395 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-catalog-content\") pod \"76516b9d-8ed8-45d7-adcc-d26065ac1237\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.261426 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x58z2\" (UniqueName: \"kubernetes.io/projected/76516b9d-8ed8-45d7-adcc-d26065ac1237-kube-api-access-x58z2\") pod \"76516b9d-8ed8-45d7-adcc-d26065ac1237\" (UID: \"76516b9d-8ed8-45d7-adcc-d26065ac1237\") " Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.261979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-utilities" (OuterVolumeSpecName: "utilities") pod "76516b9d-8ed8-45d7-adcc-d26065ac1237" (UID: "76516b9d-8ed8-45d7-adcc-d26065ac1237"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.266879 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76516b9d-8ed8-45d7-adcc-d26065ac1237-kube-api-access-x58z2" (OuterVolumeSpecName: "kube-api-access-x58z2") pod "76516b9d-8ed8-45d7-adcc-d26065ac1237" (UID: "76516b9d-8ed8-45d7-adcc-d26065ac1237"). 
InnerVolumeSpecName "kube-api-access-x58z2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.300730 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "76516b9d-8ed8-45d7-adcc-d26065ac1237" (UID: "76516b9d-8ed8-45d7-adcc-d26065ac1237"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.364457 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.364513 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/76516b9d-8ed8-45d7-adcc-d26065ac1237-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:46:28 crc kubenswrapper[4831]: I1203 07:46:28.364538 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x58z2\" (UniqueName: \"kubernetes.io/projected/76516b9d-8ed8-45d7-adcc-d26065ac1237-kube-api-access-x58z2\") on node \"crc\" DevicePath \"\"" Dec 03 07:46:29 crc kubenswrapper[4831]: I1203 07:46:29.061824 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5t86s" Dec 03 07:46:29 crc kubenswrapper[4831]: I1203 07:46:29.101884 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5t86s"] Dec 03 07:46:29 crc kubenswrapper[4831]: I1203 07:46:29.112433 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5t86s"] Dec 03 07:46:31 crc kubenswrapper[4831]: I1203 07:46:31.025658 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" path="/var/lib/kubelet/pods/76516b9d-8ed8-45d7-adcc-d26065ac1237/volumes" Dec 03 07:46:37 crc kubenswrapper[4831]: I1203 07:46:37.016787 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:46:37 crc kubenswrapper[4831]: E1203 07:46:37.017216 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:46:51 crc kubenswrapper[4831]: I1203 07:46:51.013096 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:46:51 crc kubenswrapper[4831]: E1203 07:46:51.014115 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.486889 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-p2xx8"] Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.493894 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-p2xx8"] Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.614169 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-wnrkd"] Dec 03 07:47:01 crc kubenswrapper[4831]: E1203 07:47:01.614807 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerName="extract-content" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.614843 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerName="extract-content" Dec 03 07:47:01 crc kubenswrapper[4831]: E1203 07:47:01.614884 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerName="extract-utilities" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.614901 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerName="extract-utilities" Dec 03 07:47:01 crc kubenswrapper[4831]: E1203 07:47:01.614923 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerName="registry-server" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.614939 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerName="registry-server" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.615391 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="76516b9d-8ed8-45d7-adcc-d26065ac1237" containerName="registry-server" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.616455 4831 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.619529 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.619600 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.619529 4831 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6mcp8" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.620091 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.626273 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wnrkd"] Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.800929 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d93333ea-53fd-4bfb-b277-a79a3ff7972b-crc-storage\") pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.800989 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzqh7\" (UniqueName: \"kubernetes.io/projected/d93333ea-53fd-4bfb-b277-a79a3ff7972b-kube-api-access-jzqh7\") pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.801061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d93333ea-53fd-4bfb-b277-a79a3ff7972b-node-mnt\") 
pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.902531 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d93333ea-53fd-4bfb-b277-a79a3ff7972b-node-mnt\") pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.902761 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d93333ea-53fd-4bfb-b277-a79a3ff7972b-crc-storage\") pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.902798 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzqh7\" (UniqueName: \"kubernetes.io/projected/d93333ea-53fd-4bfb-b277-a79a3ff7972b-kube-api-access-jzqh7\") pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.903011 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d93333ea-53fd-4bfb-b277-a79a3ff7972b-node-mnt\") pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.904522 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d93333ea-53fd-4bfb-b277-a79a3ff7972b-crc-storage\") pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc 
kubenswrapper[4831]: I1203 07:47:01.938848 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzqh7\" (UniqueName: \"kubernetes.io/projected/d93333ea-53fd-4bfb-b277-a79a3ff7972b-kube-api-access-jzqh7\") pod \"crc-storage-crc-wnrkd\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:01 crc kubenswrapper[4831]: I1203 07:47:01.941204 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:02 crc kubenswrapper[4831]: I1203 07:47:02.231533 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-wnrkd"] Dec 03 07:47:02 crc kubenswrapper[4831]: I1203 07:47:02.388127 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wnrkd" event={"ID":"d93333ea-53fd-4bfb-b277-a79a3ff7972b","Type":"ContainerStarted","Data":"6dd58aaee405acb616f552954a1a83dee35cef34b93d8ce667292413160f12fb"} Dec 03 07:47:03 crc kubenswrapper[4831]: I1203 07:47:03.037553 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a25efc27-9981-4e58-bf27-8e9650464fc4" path="/var/lib/kubelet/pods/a25efc27-9981-4e58-bf27-8e9650464fc4/volumes" Dec 03 07:47:03 crc kubenswrapper[4831]: I1203 07:47:03.398176 4831 generic.go:334] "Generic (PLEG): container finished" podID="d93333ea-53fd-4bfb-b277-a79a3ff7972b" containerID="ccd04f15bf9e07532d64edd9da97c634a3ca7e855be113135dedbec011e7debc" exitCode=0 Dec 03 07:47:03 crc kubenswrapper[4831]: I1203 07:47:03.398253 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wnrkd" event={"ID":"d93333ea-53fd-4bfb-b277-a79a3ff7972b","Type":"ContainerDied","Data":"ccd04f15bf9e07532d64edd9da97c634a3ca7e855be113135dedbec011e7debc"} Dec 03 07:47:04 crc kubenswrapper[4831]: I1203 07:47:04.781058 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:04 crc kubenswrapper[4831]: I1203 07:47:04.953440 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d93333ea-53fd-4bfb-b277-a79a3ff7972b-node-mnt\") pod \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " Dec 03 07:47:04 crc kubenswrapper[4831]: I1203 07:47:04.953887 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d93333ea-53fd-4bfb-b277-a79a3ff7972b-crc-storage\") pod \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " Dec 03 07:47:04 crc kubenswrapper[4831]: I1203 07:47:04.953949 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzqh7\" (UniqueName: \"kubernetes.io/projected/d93333ea-53fd-4bfb-b277-a79a3ff7972b-kube-api-access-jzqh7\") pod \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\" (UID: \"d93333ea-53fd-4bfb-b277-a79a3ff7972b\") " Dec 03 07:47:04 crc kubenswrapper[4831]: I1203 07:47:04.953596 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d93333ea-53fd-4bfb-b277-a79a3ff7972b-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "d93333ea-53fd-4bfb-b277-a79a3ff7972b" (UID: "d93333ea-53fd-4bfb-b277-a79a3ff7972b"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:47:04 crc kubenswrapper[4831]: I1203 07:47:04.954223 4831 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/d93333ea-53fd-4bfb-b277-a79a3ff7972b-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 03 07:47:04 crc kubenswrapper[4831]: I1203 07:47:04.964933 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d93333ea-53fd-4bfb-b277-a79a3ff7972b-kube-api-access-jzqh7" (OuterVolumeSpecName: "kube-api-access-jzqh7") pod "d93333ea-53fd-4bfb-b277-a79a3ff7972b" (UID: "d93333ea-53fd-4bfb-b277-a79a3ff7972b"). InnerVolumeSpecName "kube-api-access-jzqh7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:47:04 crc kubenswrapper[4831]: I1203 07:47:04.977467 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d93333ea-53fd-4bfb-b277-a79a3ff7972b-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "d93333ea-53fd-4bfb-b277-a79a3ff7972b" (UID: "d93333ea-53fd-4bfb-b277-a79a3ff7972b"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:47:05 crc kubenswrapper[4831]: I1203 07:47:05.056451 4831 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/d93333ea-53fd-4bfb-b277-a79a3ff7972b-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 03 07:47:05 crc kubenswrapper[4831]: I1203 07:47:05.056492 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzqh7\" (UniqueName: \"kubernetes.io/projected/d93333ea-53fd-4bfb-b277-a79a3ff7972b-kube-api-access-jzqh7\") on node \"crc\" DevicePath \"\"" Dec 03 07:47:05 crc kubenswrapper[4831]: I1203 07:47:05.431376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-wnrkd" event={"ID":"d93333ea-53fd-4bfb-b277-a79a3ff7972b","Type":"ContainerDied","Data":"6dd58aaee405acb616f552954a1a83dee35cef34b93d8ce667292413160f12fb"} Dec 03 07:47:05 crc kubenswrapper[4831]: I1203 07:47:05.431416 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6dd58aaee405acb616f552954a1a83dee35cef34b93d8ce667292413160f12fb" Dec 03 07:47:05 crc kubenswrapper[4831]: I1203 07:47:05.431510 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-wnrkd" Dec 03 07:47:06 crc kubenswrapper[4831]: I1203 07:47:06.012674 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:47:06 crc kubenswrapper[4831]: E1203 07:47:06.013160 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:47:06 crc kubenswrapper[4831]: I1203 07:47:06.976694 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["crc-storage/crc-storage-crc-wnrkd"] Dec 03 07:47:06 crc kubenswrapper[4831]: I1203 07:47:06.983445 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["crc-storage/crc-storage-crc-wnrkd"] Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.028073 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d93333ea-53fd-4bfb-b277-a79a3ff7972b" path="/var/lib/kubelet/pods/d93333ea-53fd-4bfb-b277-a79a3ff7972b/volumes" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.093480 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["crc-storage/crc-storage-crc-ldzxk"] Dec 03 07:47:07 crc kubenswrapper[4831]: E1203 07:47:07.093746 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d93333ea-53fd-4bfb-b277-a79a3ff7972b" containerName="storage" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.093757 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d93333ea-53fd-4bfb-b277-a79a3ff7972b" containerName="storage" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.093891 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d93333ea-53fd-4bfb-b277-a79a3ff7972b" containerName="storage" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.094393 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.096829 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"openshift-service-ca.crt" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.097256 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"kube-root-ca.crt" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.098521 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"crc-storage"/"crc-storage" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.098535 4831 reflector.go:368] Caches populated for *v1.Secret from object-"crc-storage"/"crc-storage-dockercfg-6mcp8" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.106734 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ldzxk"] Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.285830 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9l4n\" (UniqueName: \"kubernetes.io/projected/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-kube-api-access-q9l4n\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.286016 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-node-mnt\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.286061 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-crc-storage\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.387777 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-node-mnt\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.387894 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-crc-storage\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.388020 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9l4n\" (UniqueName: \"kubernetes.io/projected/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-kube-api-access-q9l4n\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.388143 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-node-mnt\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.389368 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"crc-storage\" (UniqueName: 
\"kubernetes.io/configmap/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-crc-storage\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.420423 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9l4n\" (UniqueName: \"kubernetes.io/projected/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-kube-api-access-q9l4n\") pod \"crc-storage-crc-ldzxk\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:07 crc kubenswrapper[4831]: I1203 07:47:07.709289 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:08 crc kubenswrapper[4831]: I1203 07:47:08.190868 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["crc-storage/crc-storage-crc-ldzxk"] Dec 03 07:47:08 crc kubenswrapper[4831]: I1203 07:47:08.461269 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ldzxk" event={"ID":"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f","Type":"ContainerStarted","Data":"a74bea6fbd52d4c54db21d805c1fcfccbb5a518c855341014aab3b53c4ce050b"} Dec 03 07:47:09 crc kubenswrapper[4831]: I1203 07:47:09.471055 4831 generic.go:334] "Generic (PLEG): container finished" podID="a79bf067-a2a7-4dc0-996b-511e1dfcfc5f" containerID="946069dedbdaf2881635fd106944a4d9f5b92cf2f6ccf26d15c3661cfe2c9862" exitCode=0 Dec 03 07:47:09 crc kubenswrapper[4831]: I1203 07:47:09.471124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ldzxk" event={"ID":"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f","Type":"ContainerDied","Data":"946069dedbdaf2881635fd106944a4d9f5b92cf2f6ccf26d15c3661cfe2c9862"} Dec 03 07:47:10 crc kubenswrapper[4831]: I1203 07:47:10.890555 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.041859 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-node-mnt\") pod \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.041973 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-crc-storage\") pod \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.042065 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9l4n\" (UniqueName: \"kubernetes.io/projected/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-kube-api-access-q9l4n\") pod \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\" (UID: \"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f\") " Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.042094 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-node-mnt" (OuterVolumeSpecName: "node-mnt") pod "a79bf067-a2a7-4dc0-996b-511e1dfcfc5f" (UID: "a79bf067-a2a7-4dc0-996b-511e1dfcfc5f"). InnerVolumeSpecName "node-mnt". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.042466 4831 reconciler_common.go:293] "Volume detached for volume \"node-mnt\" (UniqueName: \"kubernetes.io/host-path/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-node-mnt\") on node \"crc\" DevicePath \"\"" Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.057674 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-kube-api-access-q9l4n" (OuterVolumeSpecName: "kube-api-access-q9l4n") pod "a79bf067-a2a7-4dc0-996b-511e1dfcfc5f" (UID: "a79bf067-a2a7-4dc0-996b-511e1dfcfc5f"). InnerVolumeSpecName "kube-api-access-q9l4n". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.081434 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-crc-storage" (OuterVolumeSpecName: "crc-storage") pod "a79bf067-a2a7-4dc0-996b-511e1dfcfc5f" (UID: "a79bf067-a2a7-4dc0-996b-511e1dfcfc5f"). InnerVolumeSpecName "crc-storage". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.144936 4831 reconciler_common.go:293] "Volume detached for volume \"crc-storage\" (UniqueName: \"kubernetes.io/configmap/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-crc-storage\") on node \"crc\" DevicePath \"\"" Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.145000 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q9l4n\" (UniqueName: \"kubernetes.io/projected/a79bf067-a2a7-4dc0-996b-511e1dfcfc5f-kube-api-access-q9l4n\") on node \"crc\" DevicePath \"\"" Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.496050 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="crc-storage/crc-storage-crc-ldzxk" event={"ID":"a79bf067-a2a7-4dc0-996b-511e1dfcfc5f","Type":"ContainerDied","Data":"a74bea6fbd52d4c54db21d805c1fcfccbb5a518c855341014aab3b53c4ce050b"} Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.496105 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="crc-storage/crc-storage-crc-ldzxk" Dec 03 07:47:11 crc kubenswrapper[4831]: I1203 07:47:11.496127 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a74bea6fbd52d4c54db21d805c1fcfccbb5a518c855341014aab3b53c4ce050b" Dec 03 07:47:18 crc kubenswrapper[4831]: I1203 07:47:18.112993 4831 scope.go:117] "RemoveContainer" containerID="ef6263ec235effaf9016453dc7996fc4818b46714a07f7a0cfbfe3d2031d0c15" Dec 03 07:47:21 crc kubenswrapper[4831]: I1203 07:47:21.013092 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:47:21 crc kubenswrapper[4831]: E1203 07:47:21.013540 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:47:35 crc kubenswrapper[4831]: I1203 07:47:35.012988 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:47:35 crc kubenswrapper[4831]: E1203 07:47:35.014654 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:47:41 crc kubenswrapper[4831]: I1203 07:47:41.031448 4831 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j22fq container/marketplace-operator 
namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.217.0.62:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 07:47:41 crc kubenswrapper[4831]: I1203 07:47:41.031737 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" podUID="5c596f0f-729f-4beb-b1f7-58ce65c9a928" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.62:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:47:41 crc kubenswrapper[4831]: I1203 07:47:41.031457 4831 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-j22fq container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.62:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Dec 03 07:47:41 crc kubenswrapper[4831]: I1203 07:47:41.031846 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-j22fq" podUID="5c596f0f-729f-4beb-b1f7-58ce65c9a928" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.62:8080/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 07:47:50 crc kubenswrapper[4831]: I1203 07:47:50.013135 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:47:50 crc kubenswrapper[4831]: E1203 07:47:50.014004 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:48:03 crc kubenswrapper[4831]: I1203 07:48:03.022436 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:48:03 crc kubenswrapper[4831]: I1203 07:48:03.581355 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"3baec98df1747de5e0f1821202cad3b811d612c0c379c2da31045f274fd26772"} Dec 03 07:49:05 crc kubenswrapper[4831]: I1203 07:49:05.941927 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kqfch"] Dec 03 07:49:05 crc kubenswrapper[4831]: E1203 07:49:05.943135 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79bf067-a2a7-4dc0-996b-511e1dfcfc5f" containerName="storage" Dec 03 07:49:05 crc kubenswrapper[4831]: I1203 07:49:05.943160 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79bf067-a2a7-4dc0-996b-511e1dfcfc5f" containerName="storage" Dec 03 07:49:05 crc kubenswrapper[4831]: I1203 07:49:05.943634 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79bf067-a2a7-4dc0-996b-511e1dfcfc5f" containerName="storage" Dec 03 07:49:05 crc kubenswrapper[4831]: I1203 07:49:05.946170 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:05 crc kubenswrapper[4831]: I1203 07:49:05.954404 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqfch"] Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.046567 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-catalog-content\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.046656 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq5w9\" (UniqueName: \"kubernetes.io/projected/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-kube-api-access-cq5w9\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.046751 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-utilities\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.148571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-utilities\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.148751 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-catalog-content\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.148836 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq5w9\" (UniqueName: \"kubernetes.io/projected/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-kube-api-access-cq5w9\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.150129 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-catalog-content\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.187918 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-utilities\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.209848 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq5w9\" (UniqueName: \"kubernetes.io/projected/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-kube-api-access-cq5w9\") pod \"redhat-operators-kqfch\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.310991 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:06 crc kubenswrapper[4831]: I1203 07:49:06.747445 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqfch"] Dec 03 07:49:07 crc kubenswrapper[4831]: I1203 07:49:07.199844 4831 generic.go:334] "Generic (PLEG): container finished" podID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerID="c8d382678c56d09cb7c345a0a33193a6d0c42d8c151632fbcee5e5b385eab228" exitCode=0 Dec 03 07:49:07 crc kubenswrapper[4831]: I1203 07:49:07.199896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqfch" event={"ID":"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e","Type":"ContainerDied","Data":"c8d382678c56d09cb7c345a0a33193a6d0c42d8c151632fbcee5e5b385eab228"} Dec 03 07:49:07 crc kubenswrapper[4831]: I1203 07:49:07.200166 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqfch" event={"ID":"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e","Type":"ContainerStarted","Data":"e14cd0993295632e873f7d7518e4c88e9b7ced64a48dbe92f738634a0ecc6544"} Dec 03 07:49:07 crc kubenswrapper[4831]: I1203 07:49:07.201971 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:49:08 crc kubenswrapper[4831]: I1203 07:49:08.208339 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqfch" event={"ID":"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e","Type":"ContainerStarted","Data":"295de267422a25101f3690b2c0e7d389e78d81c7042c4c2b367a4db7e2fafd8f"} Dec 03 07:49:09 crc kubenswrapper[4831]: I1203 07:49:09.218717 4831 generic.go:334] "Generic (PLEG): container finished" podID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerID="295de267422a25101f3690b2c0e7d389e78d81c7042c4c2b367a4db7e2fafd8f" exitCode=0 Dec 03 07:49:09 crc kubenswrapper[4831]: I1203 07:49:09.218761 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-kqfch" event={"ID":"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e","Type":"ContainerDied","Data":"295de267422a25101f3690b2c0e7d389e78d81c7042c4c2b367a4db7e2fafd8f"} Dec 03 07:49:10 crc kubenswrapper[4831]: I1203 07:49:10.230159 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqfch" event={"ID":"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e","Type":"ContainerStarted","Data":"39fa95d26350fdb9c0e669d190335da4bf7ff59b2669128c0634522b94dc6a22"} Dec 03 07:49:10 crc kubenswrapper[4831]: I1203 07:49:10.278373 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kqfch" podStartSLOduration=2.626596893 podStartE2EDuration="5.278346368s" podCreationTimestamp="2025-12-03 07:49:05 +0000 UTC" firstStartedPulling="2025-12-03 07:49:07.201757641 +0000 UTC m=+4684.545341139" lastFinishedPulling="2025-12-03 07:49:09.853507096 +0000 UTC m=+4687.197090614" observedRunningTime="2025-12-03 07:49:10.268547332 +0000 UTC m=+4687.612130870" watchObservedRunningTime="2025-12-03 07:49:10.278346368 +0000 UTC m=+4687.621929916" Dec 03 07:49:16 crc kubenswrapper[4831]: I1203 07:49:16.312504 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:16 crc kubenswrapper[4831]: I1203 07:49:16.313100 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:16 crc kubenswrapper[4831]: I1203 07:49:16.377660 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:17 crc kubenswrapper[4831]: I1203 07:49:17.364738 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:17 crc kubenswrapper[4831]: I1203 07:49:17.433446 4831 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kqfch"] Dec 03 07:49:19 crc kubenswrapper[4831]: I1203 07:49:19.306709 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kqfch" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerName="registry-server" containerID="cri-o://39fa95d26350fdb9c0e669d190335da4bf7ff59b2669128c0634522b94dc6a22" gracePeriod=2 Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.327818 4831 generic.go:334] "Generic (PLEG): container finished" podID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerID="39fa95d26350fdb9c0e669d190335da4bf7ff59b2669128c0634522b94dc6a22" exitCode=0 Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.327920 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqfch" event={"ID":"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e","Type":"ContainerDied","Data":"39fa95d26350fdb9c0e669d190335da4bf7ff59b2669128c0634522b94dc6a22"} Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.583523 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.712299 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-utilities\") pod \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.712554 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq5w9\" (UniqueName: \"kubernetes.io/projected/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-kube-api-access-cq5w9\") pod \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.712646 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-catalog-content\") pod \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\" (UID: \"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e\") " Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.714383 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-utilities" (OuterVolumeSpecName: "utilities") pod "60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" (UID: "60f3ff95-aaf6-4d06-83df-61d8ce86ad2e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.721761 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-kube-api-access-cq5w9" (OuterVolumeSpecName: "kube-api-access-cq5w9") pod "60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" (UID: "60f3ff95-aaf6-4d06-83df-61d8ce86ad2e"). InnerVolumeSpecName "kube-api-access-cq5w9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.814531 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.814606 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq5w9\" (UniqueName: \"kubernetes.io/projected/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-kube-api-access-cq5w9\") on node \"crc\" DevicePath \"\"" Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.827763 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" (UID: "60f3ff95-aaf6-4d06-83df-61d8ce86ad2e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:49:21 crc kubenswrapper[4831]: I1203 07:49:21.916571 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:49:22 crc kubenswrapper[4831]: I1203 07:49:22.344847 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqfch" event={"ID":"60f3ff95-aaf6-4d06-83df-61d8ce86ad2e","Type":"ContainerDied","Data":"e14cd0993295632e873f7d7518e4c88e9b7ced64a48dbe92f738634a0ecc6544"} Dec 03 07:49:22 crc kubenswrapper[4831]: I1203 07:49:22.344955 4831 scope.go:117] "RemoveContainer" containerID="39fa95d26350fdb9c0e669d190335da4bf7ff59b2669128c0634522b94dc6a22" Dec 03 07:49:22 crc kubenswrapper[4831]: I1203 07:49:22.344959 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kqfch" Dec 03 07:49:22 crc kubenswrapper[4831]: I1203 07:49:22.383926 4831 scope.go:117] "RemoveContainer" containerID="295de267422a25101f3690b2c0e7d389e78d81c7042c4c2b367a4db7e2fafd8f" Dec 03 07:49:22 crc kubenswrapper[4831]: I1203 07:49:22.406885 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kqfch"] Dec 03 07:49:22 crc kubenswrapper[4831]: I1203 07:49:22.417185 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kqfch"] Dec 03 07:49:22 crc kubenswrapper[4831]: I1203 07:49:22.427159 4831 scope.go:117] "RemoveContainer" containerID="c8d382678c56d09cb7c345a0a33193a6d0c42d8c151632fbcee5e5b385eab228" Dec 03 07:49:23 crc kubenswrapper[4831]: I1203 07:49:23.023578 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" path="/var/lib/kubelet/pods/60f3ff95-aaf6-4d06-83df-61d8ce86ad2e/volumes" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.194635 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2hczv"] Dec 03 07:50:21 crc kubenswrapper[4831]: E1203 07:50:21.195375 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerName="extract-content" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.195391 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerName="extract-content" Dec 03 07:50:21 crc kubenswrapper[4831]: E1203 07:50:21.195430 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerName="registry-server" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.195438 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerName="registry-server" Dec 03 
07:50:21 crc kubenswrapper[4831]: E1203 07:50:21.195452 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerName="extract-utilities" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.195461 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerName="extract-utilities" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.195636 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="60f3ff95-aaf6-4d06-83df-61d8ce86ad2e" containerName="registry-server" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.196593 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.205826 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2hczv"] Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.302149 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxbl5\" (UniqueName: \"kubernetes.io/projected/1104f06f-a455-4f73-93da-6fedeb0f5a7d-kube-api-access-jxbl5\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.302458 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-catalog-content\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.302506 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-utilities\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.403943 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-catalog-content\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.404001 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-utilities\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.404060 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxbl5\" (UniqueName: \"kubernetes.io/projected/1104f06f-a455-4f73-93da-6fedeb0f5a7d-kube-api-access-jxbl5\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.404598 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-catalog-content\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.404708 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-utilities\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.427447 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxbl5\" (UniqueName: \"kubernetes.io/projected/1104f06f-a455-4f73-93da-6fedeb0f5a7d-kube-api-access-jxbl5\") pod \"community-operators-2hczv\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:21 crc kubenswrapper[4831]: I1203 07:50:21.549082 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:22 crc kubenswrapper[4831]: I1203 07:50:22.084055 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2hczv"] Dec 03 07:50:22 crc kubenswrapper[4831]: I1203 07:50:22.911546 4831 generic.go:334] "Generic (PLEG): container finished" podID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerID="e697e764efcb9a38e5b99dc7cd16bfc4b53b51d10e052939807c5ad6543d4ce2" exitCode=0 Dec 03 07:50:22 crc kubenswrapper[4831]: I1203 07:50:22.911656 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hczv" event={"ID":"1104f06f-a455-4f73-93da-6fedeb0f5a7d","Type":"ContainerDied","Data":"e697e764efcb9a38e5b99dc7cd16bfc4b53b51d10e052939807c5ad6543d4ce2"} Dec 03 07:50:22 crc kubenswrapper[4831]: I1203 07:50:22.912073 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hczv" event={"ID":"1104f06f-a455-4f73-93da-6fedeb0f5a7d","Type":"ContainerStarted","Data":"bac0976df2e69e461884116a5470dea7b6429e7e46e268ee2cdc4fdfd26beec4"} Dec 03 07:50:23 crc kubenswrapper[4831]: I1203 07:50:23.933866 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-2hczv" event={"ID":"1104f06f-a455-4f73-93da-6fedeb0f5a7d","Type":"ContainerStarted","Data":"9b14c68e6dd37466018d576421e9d145cdeedfcc4506b08a349e209a300694d0"} Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.857023 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-mxfbn"] Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.858921 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.860807 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.870435 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.871213 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.872388 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.872429 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-nxfst" Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.885619 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-mxfbn"] Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.942787 4831 generic.go:334] "Generic (PLEG): container finished" podID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerID="9b14c68e6dd37466018d576421e9d145cdeedfcc4506b08a349e209a300694d0" exitCode=0 Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.942830 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hczv" 
event={"ID":"1104f06f-a455-4f73-93da-6fedeb0f5a7d","Type":"ContainerDied","Data":"9b14c68e6dd37466018d576421e9d145cdeedfcc4506b08a349e209a300694d0"} Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.958219 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.958288 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-885tq\" (UniqueName: \"kubernetes.io/projected/d8527b96-1b76-4c5c-ae2a-634931b06163-kube-api-access-885tq\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:24 crc kubenswrapper[4831]: I1203 07:50:24.958433 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-config\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.059269 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.059342 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-885tq\" (UniqueName: 
\"kubernetes.io/projected/d8527b96-1b76-4c5c-ae2a-634931b06163-kube-api-access-885tq\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.059386 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-config\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.060484 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-config\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.061416 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-dns-svc\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.069724 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-j6wcv"] Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.070823 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.079847 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-j6wcv"] Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.105710 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-885tq\" (UniqueName: \"kubernetes.io/projected/d8527b96-1b76-4c5c-ae2a-634931b06163-kube-api-access-885tq\") pod \"dnsmasq-dns-5d7b5456f5-mxfbn\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.172610 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.261293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-config\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.261357 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw8vp\" (UniqueName: \"kubernetes.io/projected/9cc56b85-be93-41b1-bdef-8944d1b349c1-kube-api-access-pw8vp\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.261878 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " 
pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.366127 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-config\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.366169 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw8vp\" (UniqueName: \"kubernetes.io/projected/9cc56b85-be93-41b1-bdef-8944d1b349c1-kube-api-access-pw8vp\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.366205 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.367245 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-dns-svc\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.367252 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-config\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.390897 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw8vp\" (UniqueName: \"kubernetes.io/projected/9cc56b85-be93-41b1-bdef-8944d1b349c1-kube-api-access-pw8vp\") pod \"dnsmasq-dns-98ddfc8f-j6wcv\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.410499 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.684598 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-mxfbn"] Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.958803 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.970501 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.970527 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" event={"ID":"d8527b96-1b76-4c5c-ae2a-634931b06163","Type":"ContainerStarted","Data":"e2aa271ad21265a0c053246235c1b54d9140d6f7be1c583a1b53f4df070275f6"} Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.970616 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.972689 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.973402 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.973719 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.973939 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 07:50:25 crc kubenswrapper[4831]: I1203 07:50:25.977403 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-smx72" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076008 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076140 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076157 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qdzj\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-kube-api-access-6qdzj\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076198 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076218 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.076264 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.126021 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-j6wcv"] Dec 03 07:50:26 crc kubenswrapper[4831]: W1203 07:50:26.139354 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cc56b85_be93_41b1_bdef_8944d1b349c1.slice/crio-357717be64f30a2810081ae6d091a9745b31ee6ec8fa7912654697aa9af11062 WatchSource:0}: Error finding container 357717be64f30a2810081ae6d091a9745b31ee6ec8fa7912654697aa9af11062: Status 404 returned error can't find the container with id 357717be64f30a2810081ae6d091a9745b31ee6ec8fa7912654697aa9af11062 Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178736 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178752 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-6qdzj\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-kube-api-access-6qdzj\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178824 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178863 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178893 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178917 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.178998 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.181402 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.183230 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.183585 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.184384 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " 
pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.185452 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.185499 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f9fda22de49c54b2fe990b90ce000982097b618ca40422e3591ee1046704283/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.187002 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.191884 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.207937 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.216861 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6qdzj\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-kube-api-access-6qdzj\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.235426 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"rabbitmq-server-0\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.284344 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.285429 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.287868 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.288229 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.293738 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.294191 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.294261 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7p6cj" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.296750 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.300556 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383169 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aea3084a-d57f-49c8-b479-d031b774e0e9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383196 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aea3084a-d57f-49c8-b479-d031b774e0e9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383270 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" 
(UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383298 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383335 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383355 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.383373 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5hh5\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-kube-api-access-c5hh5\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.487988 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488241 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aea3084a-d57f-49c8-b479-d031b774e0e9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488263 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488308 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aea3084a-d57f-49c8-b479-d031b774e0e9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488359 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488384 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488410 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488429 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488450 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5hh5\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-kube-api-access-c5hh5\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.488963 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.489842 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: 
\"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.489996 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.490013 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.492718 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aea3084a-d57f-49c8-b479-d031b774e0e9-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.493926 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.494367 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aea3084a-d57f-49c8-b479-d031b774e0e9-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.496862 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.496891 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fcef2430a6210fdc275c89f6d0f015cb2d33458f21cca7e3c75761454f338e2f/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.505384 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5hh5\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-kube-api-access-c5hh5\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.783199 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:50:26 crc kubenswrapper[4831]: W1203 07:50:26.923345 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4eae5aa7_d1e1_40f2_b3ee_6a54ec93bad8.slice/crio-dea9ce767b57d363c4896049ba5e840f8f3fbbe3114f88099d482633b6ac86df WatchSource:0}: Error finding container dea9ce767b57d363c4896049ba5e840f8f3fbbe3114f88099d482633b6ac86df: Status 404 returned error can't find the container with id dea9ce767b57d363c4896049ba5e840f8f3fbbe3114f88099d482633b6ac86df Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 
07:50:26.974881 4831 generic.go:334] "Generic (PLEG): container finished" podID="9cc56b85-be93-41b1-bdef-8944d1b349c1" containerID="6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073" exitCode=0 Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.974986 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" event={"ID":"9cc56b85-be93-41b1-bdef-8944d1b349c1","Type":"ContainerDied","Data":"6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073"} Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.975246 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" event={"ID":"9cc56b85-be93-41b1-bdef-8944d1b349c1","Type":"ContainerStarted","Data":"357717be64f30a2810081ae6d091a9745b31ee6ec8fa7912654697aa9af11062"} Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.977357 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8","Type":"ContainerStarted","Data":"dea9ce767b57d363c4896049ba5e840f8f3fbbe3114f88099d482633b6ac86df"} Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.981726 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hczv" event={"ID":"1104f06f-a455-4f73-93da-6fedeb0f5a7d","Type":"ContainerStarted","Data":"78a45fdd84152e26db587743bcdd0e3167957e76ceddb1a25d974bec24a887f7"} Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.983108 4831 generic.go:334] "Generic (PLEG): container finished" podID="d8527b96-1b76-4c5c-ae2a-634931b06163" containerID="f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae" exitCode=0 Dec 03 07:50:26 crc kubenswrapper[4831]: I1203 07:50:26.983136 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" 
event={"ID":"d8527b96-1b76-4c5c-ae2a-634931b06163","Type":"ContainerDied","Data":"f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae"} Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.031440 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2hczv" podStartSLOduration=3.3012891030000002 podStartE2EDuration="6.031414355s" podCreationTimestamp="2025-12-03 07:50:21 +0000 UTC" firstStartedPulling="2025-12-03 07:50:22.92013686 +0000 UTC m=+4760.263720398" lastFinishedPulling="2025-12-03 07:50:25.650262142 +0000 UTC m=+4762.993845650" observedRunningTime="2025-12-03 07:50:27.021542146 +0000 UTC m=+4764.365125684" watchObservedRunningTime="2025-12-03 07:50:27.031414355 +0000 UTC m=+4764.374997873" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.128898 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"rabbitmq-cell1-server-0\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.210680 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.212234 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.216479 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-t5cpv" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.217090 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.217247 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.217484 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.222136 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.230218 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.258193 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.305258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.305457 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.305517 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d48e5d38-3fd5-4017-a872-21a6782a6fd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48e5d38-3fd5-4017-a872-21a6782a6fd0\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.305570 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.305610 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfvzb\" (UniqueName: \"kubernetes.io/projected/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-kube-api-access-mfvzb\") pod \"openstack-galera-0\" (UID: 
\"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.305644 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.305693 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.305736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.407245 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.407579 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " 
pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.407605 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.407627 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.407662 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d48e5d38-3fd5-4017-a872-21a6782a6fd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48e5d38-3fd5-4017-a872-21a6782a6fd0\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.407702 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.407734 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvzb\" (UniqueName: \"kubernetes.io/projected/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-kube-api-access-mfvzb\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 
07:50:27.407756 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.408385 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-config-data-generated\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.409250 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-config-data-default\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.409932 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-kolla-config\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.411454 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-operator-scripts\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.412009 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.418936 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.418976 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d48e5d38-3fd5-4017-a872-21a6782a6fd0\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48e5d38-3fd5-4017-a872-21a6782a6fd0\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9a88e2366c67f7ae77b8fa1daea6f98b3bb76d99f8069e1dbb5e4f9bf3e74c91/globalmount\"" pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.431135 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvzb\" (UniqueName: \"kubernetes.io/projected/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-kube-api-access-mfvzb\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.435253 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.469526 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d48e5d38-3fd5-4017-a872-21a6782a6fd0\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d48e5d38-3fd5-4017-a872-21a6782a6fd0\") pod \"openstack-galera-0\" (UID: \"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa\") " pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.538075 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.596947 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.597007 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.764463 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:50:27 crc kubenswrapper[4831]: W1203 07:50:27.793494 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaea3084a_d57f_49c8_b479_d031b774e0e9.slice/crio-439ba529d8f141ddaff252af4ad021b172c05334d977cc1f6c38a25e6e2b42c2 WatchSource:0}: Error finding container 439ba529d8f141ddaff252af4ad021b172c05334d977cc1f6c38a25e6e2b42c2: Status 404 returned error can't find the container with id 439ba529d8f141ddaff252af4ad021b172c05334d977cc1f6c38a25e6e2b42c2 Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.798058 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Dec 03 07:50:27 crc 
kubenswrapper[4831]: I1203 07:50:27.803539 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.811089 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.811776 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-m2pbq" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.814502 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.829296 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.924048 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-kolla-config\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.924106 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxrl2\" (UniqueName: \"kubernetes.io/projected/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-kube-api-access-mxrl2\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.924133 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-config-data\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.992445 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa","Type":"ContainerStarted","Data":"dafddf8c7750c4bb76b76cd77765f850bc597e6c6140c065d9e79dd7b4ea0c60"} Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.994489 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" event={"ID":"d8527b96-1b76-4c5c-ae2a-634931b06163","Type":"ContainerStarted","Data":"35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc"} Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.995607 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.995738 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" event={"ID":"9cc56b85-be93-41b1-bdef-8944d1b349c1","Type":"ContainerStarted","Data":"08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6"} Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.995892 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:27 crc kubenswrapper[4831]: I1203 07:50:27.996730 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aea3084a-d57f-49c8-b479-d031b774e0e9","Type":"ContainerStarted","Data":"439ba529d8f141ddaff252af4ad021b172c05334d977cc1f6c38a25e6e2b42c2"} Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.012143 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" podStartSLOduration=4.012127928 podStartE2EDuration="4.012127928s" podCreationTimestamp="2025-12-03 07:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:50:28.01187544 +0000 UTC 
m=+4765.355458948" watchObservedRunningTime="2025-12-03 07:50:28.012127928 +0000 UTC m=+4765.355711436" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.025711 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-kolla-config\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.025756 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mxrl2\" (UniqueName: \"kubernetes.io/projected/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-kube-api-access-mxrl2\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.025779 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-config-data\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.026406 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-kolla-config\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.026429 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-config-data\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.039001 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" podStartSLOduration=3.038976846 podStartE2EDuration="3.038976846s" podCreationTimestamp="2025-12-03 07:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:50:28.033278808 +0000 UTC m=+4765.376862316" watchObservedRunningTime="2025-12-03 07:50:28.038976846 +0000 UTC m=+4765.382560364" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.045898 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxrl2\" (UniqueName: \"kubernetes.io/projected/fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4-kube-api-access-mxrl2\") pod \"memcached-0\" (UID: \"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4\") " pod="openstack/memcached-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.133543 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Dec 03 07:50:28 crc kubenswrapper[4831]: W1203 07:50:28.584427 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfda9b8fb_e9a0_4c3a_8b54_13a6d5f35af4.slice/crio-075ef9d681b7ce89cdf764c0cc13ea9d2bb00cb5e3914f584f0a6093763a0911 WatchSource:0}: Error finding container 075ef9d681b7ce89cdf764c0cc13ea9d2bb00cb5e3914f584f0a6093763a0911: Status 404 returned error can't find the container with id 075ef9d681b7ce89cdf764c0cc13ea9d2bb00cb5e3914f584f0a6093763a0911 Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.585580 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.679144 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.681073 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.684994 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-5c966" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.685282 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.685288 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.685529 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.697710 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.837328 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.837376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5db05cc0-7933-455e-8b79-23ee19abd027-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.837397 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/5db05cc0-7933-455e-8b79-23ee19abd027-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.837445 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.837466 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-544d499f-b77c-4391-84b7-ef7d248e7391\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-544d499f-b77c-4391-84b7-ef7d248e7391\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.837487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqrxs\" (UniqueName: \"kubernetes.io/projected/5db05cc0-7933-455e-8b79-23ee19abd027-kube-api-access-fqrxs\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.837512 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.837525 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db05cc0-7933-455e-8b79-23ee19abd027-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.939382 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.939447 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-544d499f-b77c-4391-84b7-ef7d248e7391\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-544d499f-b77c-4391-84b7-ef7d248e7391\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.939485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqrxs\" (UniqueName: \"kubernetes.io/projected/5db05cc0-7933-455e-8b79-23ee19abd027-kube-api-access-fqrxs\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.939539 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.939573 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db05cc0-7933-455e-8b79-23ee19abd027-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.939680 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.939727 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5db05cc0-7933-455e-8b79-23ee19abd027-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.939760 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db05cc0-7933-455e-8b79-23ee19abd027-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.940285 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/5db05cc0-7933-455e-8b79-23ee19abd027-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.940340 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" 
(UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.941034 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.942113 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5db05cc0-7933-455e-8b79-23ee19abd027-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.942906 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.942958 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-544d499f-b77c-4391-84b7-ef7d248e7391\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-544d499f-b77c-4391-84b7-ef7d248e7391\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/9741a472e1af52a8e9225e2805f0899a5d1308164dff36ad974646b0aa058b91/globalmount\"" pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.945793 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5db05cc0-7933-455e-8b79-23ee19abd027-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.946548 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/5db05cc0-7933-455e-8b79-23ee19abd027-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.962570 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqrxs\" (UniqueName: \"kubernetes.io/projected/5db05cc0-7933-455e-8b79-23ee19abd027-kube-api-access-fqrxs\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:28 crc kubenswrapper[4831]: I1203 07:50:28.990194 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-544d499f-b77c-4391-84b7-ef7d248e7391\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-544d499f-b77c-4391-84b7-ef7d248e7391\") pod \"openstack-cell1-galera-0\" (UID: \"5db05cc0-7933-455e-8b79-23ee19abd027\") " pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:29 crc kubenswrapper[4831]: I1203 07:50:29.004773 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa","Type":"ContainerStarted","Data":"7fff52733c80527216f38fb2eb9244ef41b727c792f655faecfbffa6ea5fa5dd"} Dec 03 07:50:29 crc kubenswrapper[4831]: I1203 07:50:29.006881 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4","Type":"ContainerStarted","Data":"b3968dc948fd60945bb8e8fd1516b97da3e0cec6812054275983cc83ab610ad4"} Dec 03 07:50:29 crc kubenswrapper[4831]: I1203 07:50:29.006923 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4","Type":"ContainerStarted","Data":"075ef9d681b7ce89cdf764c0cc13ea9d2bb00cb5e3914f584f0a6093763a0911"} Dec 03 07:50:29 crc kubenswrapper[4831]: I1203 07:50:29.007100 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Dec 03 07:50:29 crc kubenswrapper[4831]: I1203 07:50:29.009049 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8","Type":"ContainerStarted","Data":"a62aa85d22c43800bd710b912db5265b4bc85618bbf3308ad36ab60b8722d7d8"} Dec 03 07:50:29 crc kubenswrapper[4831]: I1203 07:50:29.018476 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:29 crc kubenswrapper[4831]: I1203 07:50:29.051977 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.051956897 podStartE2EDuration="2.051956897s" podCreationTimestamp="2025-12-03 07:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:50:29.04660973 +0000 UTC m=+4766.390193248" watchObservedRunningTime="2025-12-03 07:50:29.051956897 +0000 UTC m=+4766.395540405" Dec 03 07:50:29 crc kubenswrapper[4831]: I1203 07:50:29.498527 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Dec 03 07:50:29 crc kubenswrapper[4831]: W1203 07:50:29.511195 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5db05cc0_7933_455e_8b79_23ee19abd027.slice/crio-5305b4fadc452b1893fc242035a54c1b8753a6e6bf13fb5db6dbb9c26df5ed00 WatchSource:0}: Error finding container 5305b4fadc452b1893fc242035a54c1b8753a6e6bf13fb5db6dbb9c26df5ed00: Status 404 returned error can't find the container with id 5305b4fadc452b1893fc242035a54c1b8753a6e6bf13fb5db6dbb9c26df5ed00 Dec 03 07:50:30 crc kubenswrapper[4831]: I1203 07:50:30.019396 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aea3084a-d57f-49c8-b479-d031b774e0e9","Type":"ContainerStarted","Data":"0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc"} Dec 03 07:50:30 crc kubenswrapper[4831]: I1203 07:50:30.022602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5db05cc0-7933-455e-8b79-23ee19abd027","Type":"ContainerStarted","Data":"3e35e84d5be70939e397f3243037074c6905162f2e90f09ca2f53880d1950ae6"} Dec 03 07:50:30 crc kubenswrapper[4831]: I1203 
07:50:30.022638 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5db05cc0-7933-455e-8b79-23ee19abd027","Type":"ContainerStarted","Data":"5305b4fadc452b1893fc242035a54c1b8753a6e6bf13fb5db6dbb9c26df5ed00"} Dec 03 07:50:31 crc kubenswrapper[4831]: I1203 07:50:31.551271 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:31 crc kubenswrapper[4831]: I1203 07:50:31.551439 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:31 crc kubenswrapper[4831]: I1203 07:50:31.601087 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:32 crc kubenswrapper[4831]: I1203 07:50:32.041224 4831 generic.go:334] "Generic (PLEG): container finished" podID="3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa" containerID="7fff52733c80527216f38fb2eb9244ef41b727c792f655faecfbffa6ea5fa5dd" exitCode=0 Dec 03 07:50:32 crc kubenswrapper[4831]: I1203 07:50:32.041352 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa","Type":"ContainerDied","Data":"7fff52733c80527216f38fb2eb9244ef41b727c792f655faecfbffa6ea5fa5dd"} Dec 03 07:50:32 crc kubenswrapper[4831]: I1203 07:50:32.106889 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:32 crc kubenswrapper[4831]: I1203 07:50:32.176374 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2hczv"] Dec 03 07:50:33 crc kubenswrapper[4831]: I1203 07:50:33.051226 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" 
event={"ID":"3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa","Type":"ContainerStarted","Data":"05c635a4f50193e8fd1b2ca8f9576cb9cdf284b81a4fe5d21aa5ab145fca7e29"} Dec 03 07:50:33 crc kubenswrapper[4831]: I1203 07:50:33.091761 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=7.09172863 podStartE2EDuration="7.09172863s" podCreationTimestamp="2025-12-03 07:50:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:50:33.082204912 +0000 UTC m=+4770.425788480" watchObservedRunningTime="2025-12-03 07:50:33.09172863 +0000 UTC m=+4770.435312178" Dec 03 07:50:33 crc kubenswrapper[4831]: I1203 07:50:33.135531 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Dec 03 07:50:34 crc kubenswrapper[4831]: I1203 07:50:34.062287 4831 generic.go:334] "Generic (PLEG): container finished" podID="5db05cc0-7933-455e-8b79-23ee19abd027" containerID="3e35e84d5be70939e397f3243037074c6905162f2e90f09ca2f53880d1950ae6" exitCode=0 Dec 03 07:50:34 crc kubenswrapper[4831]: I1203 07:50:34.062419 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5db05cc0-7933-455e-8b79-23ee19abd027","Type":"ContainerDied","Data":"3e35e84d5be70939e397f3243037074c6905162f2e90f09ca2f53880d1950ae6"} Dec 03 07:50:34 crc kubenswrapper[4831]: I1203 07:50:34.063014 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2hczv" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerName="registry-server" containerID="cri-o://78a45fdd84152e26db587743bcdd0e3167957e76ceddb1a25d974bec24a887f7" gracePeriod=2 Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.075754 4831 generic.go:334] "Generic (PLEG): container finished" podID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" 
containerID="78a45fdd84152e26db587743bcdd0e3167957e76ceddb1a25d974bec24a887f7" exitCode=0 Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.075798 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hczv" event={"ID":"1104f06f-a455-4f73-93da-6fedeb0f5a7d","Type":"ContainerDied","Data":"78a45fdd84152e26db587743bcdd0e3167957e76ceddb1a25d974bec24a887f7"} Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.077956 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"5db05cc0-7933-455e-8b79-23ee19abd027","Type":"ContainerStarted","Data":"28664924bf75f687e1d6238bb8178576c06471858e9ecb5215cf4f6efe321dc4"} Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.106353 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=8.106331966 podStartE2EDuration="8.106331966s" podCreationTimestamp="2025-12-03 07:50:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:50:35.102240619 +0000 UTC m=+4772.445824157" watchObservedRunningTime="2025-12-03 07:50:35.106331966 +0000 UTC m=+4772.449915484" Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.174703 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.264850 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.368561 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-catalog-content\") pod \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.368938 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxbl5\" (UniqueName: \"kubernetes.io/projected/1104f06f-a455-4f73-93da-6fedeb0f5a7d-kube-api-access-jxbl5\") pod \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.369024 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-utilities\") pod \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\" (UID: \"1104f06f-a455-4f73-93da-6fedeb0f5a7d\") " Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.370531 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-utilities" (OuterVolumeSpecName: "utilities") pod "1104f06f-a455-4f73-93da-6fedeb0f5a7d" (UID: "1104f06f-a455-4f73-93da-6fedeb0f5a7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.374933 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1104f06f-a455-4f73-93da-6fedeb0f5a7d-kube-api-access-jxbl5" (OuterVolumeSpecName: "kube-api-access-jxbl5") pod "1104f06f-a455-4f73-93da-6fedeb0f5a7d" (UID: "1104f06f-a455-4f73-93da-6fedeb0f5a7d"). InnerVolumeSpecName "kube-api-access-jxbl5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:50:35 crc kubenswrapper[4831]: E1203 07:50:35.401490 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.234:49538->38.102.83.234:39573: write tcp 38.102.83.234:49538->38.102.83.234:39573: write: broken pipe Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.412519 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.421139 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1104f06f-a455-4f73-93da-6fedeb0f5a7d" (UID: "1104f06f-a455-4f73-93da-6fedeb0f5a7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.471122 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-mxfbn"] Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.478653 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.478683 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxbl5\" (UniqueName: \"kubernetes.io/projected/1104f06f-a455-4f73-93da-6fedeb0f5a7d-kube-api-access-jxbl5\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:35 crc kubenswrapper[4831]: I1203 07:50:35.478695 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1104f06f-a455-4f73-93da-6fedeb0f5a7d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.091051 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" podUID="d8527b96-1b76-4c5c-ae2a-634931b06163" containerName="dnsmasq-dns" containerID="cri-o://35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc" gracePeriod=10 Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.091290 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2hczv" event={"ID":"1104f06f-a455-4f73-93da-6fedeb0f5a7d","Type":"ContainerDied","Data":"bac0976df2e69e461884116a5470dea7b6429e7e46e268ee2cdc4fdfd26beec4"} Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.091367 4831 scope.go:117] "RemoveContainer" containerID="78a45fdd84152e26db587743bcdd0e3167957e76ceddb1a25d974bec24a887f7" Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.091382 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2hczv" Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.125713 4831 scope.go:117] "RemoveContainer" containerID="9b14c68e6dd37466018d576421e9d145cdeedfcc4506b08a349e209a300694d0" Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.220144 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2hczv"] Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.228964 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2hczv"] Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.551862 4831 scope.go:117] "RemoveContainer" containerID="e697e764efcb9a38e5b99dc7cd16bfc4b53b51d10e052939807c5ad6543d4ce2" Dec 03 07:50:36 crc kubenswrapper[4831]: I1203 07:50:36.852021 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.007962 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-885tq\" (UniqueName: \"kubernetes.io/projected/d8527b96-1b76-4c5c-ae2a-634931b06163-kube-api-access-885tq\") pod \"d8527b96-1b76-4c5c-ae2a-634931b06163\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.008064 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-dns-svc\") pod \"d8527b96-1b76-4c5c-ae2a-634931b06163\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.008103 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-config\") pod \"d8527b96-1b76-4c5c-ae2a-634931b06163\" (UID: \"d8527b96-1b76-4c5c-ae2a-634931b06163\") " Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.013762 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8527b96-1b76-4c5c-ae2a-634931b06163-kube-api-access-885tq" (OuterVolumeSpecName: "kube-api-access-885tq") pod "d8527b96-1b76-4c5c-ae2a-634931b06163" (UID: "d8527b96-1b76-4c5c-ae2a-634931b06163"). InnerVolumeSpecName "kube-api-access-885tq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.027529 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" path="/var/lib/kubelet/pods/1104f06f-a455-4f73-93da-6fedeb0f5a7d/volumes" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.048409 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d8527b96-1b76-4c5c-ae2a-634931b06163" (UID: "d8527b96-1b76-4c5c-ae2a-634931b06163"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.051202 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-config" (OuterVolumeSpecName: "config") pod "d8527b96-1b76-4c5c-ae2a-634931b06163" (UID: "d8527b96-1b76-4c5c-ae2a-634931b06163"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.098097 4831 generic.go:334] "Generic (PLEG): container finished" podID="d8527b96-1b76-4c5c-ae2a-634931b06163" containerID="35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc" exitCode=0 Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.098154 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.098185 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" event={"ID":"d8527b96-1b76-4c5c-ae2a-634931b06163","Type":"ContainerDied","Data":"35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc"} Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.098251 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5d7b5456f5-mxfbn" event={"ID":"d8527b96-1b76-4c5c-ae2a-634931b06163","Type":"ContainerDied","Data":"e2aa271ad21265a0c053246235c1b54d9140d6f7be1c583a1b53f4df070275f6"} Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.098269 4831 scope.go:117] "RemoveContainer" containerID="35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.109875 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-885tq\" (UniqueName: \"kubernetes.io/projected/d8527b96-1b76-4c5c-ae2a-634931b06163-kube-api-access-885tq\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.109902 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.109911 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d8527b96-1b76-4c5c-ae2a-634931b06163-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.114739 4831 scope.go:117] "RemoveContainer" containerID="f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.126841 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-5d7b5456f5-mxfbn"] Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.136931 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5d7b5456f5-mxfbn"] Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.146097 4831 scope.go:117] "RemoveContainer" containerID="35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc" Dec 03 07:50:37 crc kubenswrapper[4831]: E1203 07:50:37.146640 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc\": container with ID starting with 35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc not found: ID does not exist" containerID="35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.146681 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc"} err="failed to get container status \"35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc\": rpc error: code = NotFound desc = could not find container \"35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc\": container with ID starting with 35b895c330c063b0c34485ab056537aee06316aa35f5ad9572767a1bc19a16cc not found: ID does not exist" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.146705 4831 scope.go:117] "RemoveContainer" containerID="f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae" Dec 03 07:50:37 crc kubenswrapper[4831]: E1203 07:50:37.147029 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae\": container with ID starting with f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae not found: ID 
does not exist" containerID="f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.147052 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae"} err="failed to get container status \"f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae\": rpc error: code = NotFound desc = could not find container \"f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae\": container with ID starting with f4e68e9466c885243dc55a44cebe70a62032fa26c50875f622d324051adcc9ae not found: ID does not exist" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.538926 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Dec 03 07:50:37 crc kubenswrapper[4831]: I1203 07:50:37.539439 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Dec 03 07:50:39 crc kubenswrapper[4831]: I1203 07:50:39.022412 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8527b96-1b76-4c5c-ae2a-634931b06163" path="/var/lib/kubelet/pods/d8527b96-1b76-4c5c-ae2a-634931b06163/volumes" Dec 03 07:50:39 crc kubenswrapper[4831]: I1203 07:50:39.022943 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:39 crc kubenswrapper[4831]: I1203 07:50:39.022972 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:40 crc kubenswrapper[4831]: I1203 07:50:40.230305 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Dec 03 07:50:40 crc kubenswrapper[4831]: I1203 07:50:40.369259 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Dec 03 07:50:41 crc 
kubenswrapper[4831]: I1203 07:50:41.491435 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:41 crc kubenswrapper[4831]: I1203 07:50:41.620464 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Dec 03 07:50:57 crc kubenswrapper[4831]: I1203 07:50:57.596766 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:50:57 crc kubenswrapper[4831]: I1203 07:50:57.597594 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:51:01 crc kubenswrapper[4831]: I1203 07:51:01.341449 4831 generic.go:334] "Generic (PLEG): container finished" podID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerID="a62aa85d22c43800bd710b912db5265b4bc85618bbf3308ad36ab60b8722d7d8" exitCode=0 Dec 03 07:51:01 crc kubenswrapper[4831]: I1203 07:51:01.341558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8","Type":"ContainerDied","Data":"a62aa85d22c43800bd710b912db5265b4bc85618bbf3308ad36ab60b8722d7d8"} Dec 03 07:51:02 crc kubenswrapper[4831]: I1203 07:51:02.353896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8","Type":"ContainerStarted","Data":"da397a07be27ad15aa344aca9beab4a97c426702d95be04d52d018c97e39e8c1"} Dec 03 07:51:02 crc kubenswrapper[4831]: I1203 
07:51:02.354497 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 07:51:02 crc kubenswrapper[4831]: I1203 07:51:02.384013 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=38.383987268 podStartE2EDuration="38.383987268s" podCreationTimestamp="2025-12-03 07:50:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:51:02.382429109 +0000 UTC m=+4799.726012647" watchObservedRunningTime="2025-12-03 07:51:02.383987268 +0000 UTC m=+4799.727570816" Dec 03 07:51:03 crc kubenswrapper[4831]: I1203 07:51:03.364893 4831 generic.go:334] "Generic (PLEG): container finished" podID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerID="0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc" exitCode=0 Dec 03 07:51:03 crc kubenswrapper[4831]: I1203 07:51:03.364991 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aea3084a-d57f-49c8-b479-d031b774e0e9","Type":"ContainerDied","Data":"0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc"} Dec 03 07:51:04 crc kubenswrapper[4831]: I1203 07:51:04.375594 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aea3084a-d57f-49c8-b479-d031b774e0e9","Type":"ContainerStarted","Data":"54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44"} Dec 03 07:51:04 crc kubenswrapper[4831]: I1203 07:51:04.376258 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:04 crc kubenswrapper[4831]: I1203 07:51:04.401695 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=39.401675141 podStartE2EDuration="39.401675141s" 
podCreationTimestamp="2025-12-03 07:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:51:04.395377884 +0000 UTC m=+4801.738961442" watchObservedRunningTime="2025-12-03 07:51:04.401675141 +0000 UTC m=+4801.745258649" Dec 03 07:51:16 crc kubenswrapper[4831]: I1203 07:51:16.301764 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 07:51:17 crc kubenswrapper[4831]: I1203 07:51:17.261507 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.237528 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-nbt4j"] Dec 03 07:51:22 crc kubenswrapper[4831]: E1203 07:51:22.240745 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8527b96-1b76-4c5c-ae2a-634931b06163" containerName="dnsmasq-dns" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.240988 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8527b96-1b76-4c5c-ae2a-634931b06163" containerName="dnsmasq-dns" Dec 03 07:51:22 crc kubenswrapper[4831]: E1203 07:51:22.241228 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8527b96-1b76-4c5c-ae2a-634931b06163" containerName="init" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.241459 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8527b96-1b76-4c5c-ae2a-634931b06163" containerName="init" Dec 03 07:51:22 crc kubenswrapper[4831]: E1203 07:51:22.241685 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerName="registry-server" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.241879 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerName="registry-server" Dec 03 
07:51:22 crc kubenswrapper[4831]: E1203 07:51:22.242075 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerName="extract-utilities" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.242256 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerName="extract-utilities" Dec 03 07:51:22 crc kubenswrapper[4831]: E1203 07:51:22.242501 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerName="extract-content" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.243656 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerName="extract-content" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.244345 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1104f06f-a455-4f73-93da-6fedeb0f5a7d" containerName="registry-server" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.244619 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8527b96-1b76-4c5c-ae2a-634931b06163" containerName="dnsmasq-dns" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.246821 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.254367 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-nbt4j"] Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.351877 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq6hh\" (UniqueName: \"kubernetes.io/projected/849c8e45-ebca-4115-9048-0f54605f6c3c-kube-api-access-sq6hh\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.351974 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.352068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.453012 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.453106 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq6hh\" (UniqueName: 
\"kubernetes.io/projected/849c8e45-ebca-4115-9048-0f54605f6c3c-kube-api-access-sq6hh\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.453140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.454100 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-dns-svc\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.454210 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.480990 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq6hh\" (UniqueName: \"kubernetes.io/projected/849c8e45-ebca-4115-9048-0f54605f6c3c-kube-api-access-sq6hh\") pod \"dnsmasq-dns-5b7946d7b9-nbt4j\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:22 crc kubenswrapper[4831]: I1203 07:51:22.605990 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:23 crc kubenswrapper[4831]: I1203 07:51:23.028221 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:51:23 crc kubenswrapper[4831]: I1203 07:51:23.501477 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-nbt4j"] Dec 03 07:51:23 crc kubenswrapper[4831]: I1203 07:51:23.547407 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" event={"ID":"849c8e45-ebca-4115-9048-0f54605f6c3c","Type":"ContainerStarted","Data":"4c45ac01a8ad3a60cf2025c431821fb19aa89e7e5e60d8992812b559cc3e4988"} Dec 03 07:51:23 crc kubenswrapper[4831]: I1203 07:51:23.622184 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:51:24 crc kubenswrapper[4831]: I1203 07:51:24.557085 4831 generic.go:334] "Generic (PLEG): container finished" podID="849c8e45-ebca-4115-9048-0f54605f6c3c" containerID="e5c43f4aad554eafc4d45ab473e54c844c10f3bb7abab3f9cf65ba112391d224" exitCode=0 Dec 03 07:51:24 crc kubenswrapper[4831]: I1203 07:51:24.557143 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" event={"ID":"849c8e45-ebca-4115-9048-0f54605f6c3c","Type":"ContainerDied","Data":"e5c43f4aad554eafc4d45ab473e54c844c10f3bb7abab3f9cf65ba112391d224"} Dec 03 07:51:25 crc kubenswrapper[4831]: I1203 07:51:25.105359 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerName="rabbitmq" containerID="cri-o://da397a07be27ad15aa344aca9beab4a97c426702d95be04d52d018c97e39e8c1" gracePeriod=604798 Dec 03 07:51:25 crc kubenswrapper[4831]: I1203 07:51:25.570779 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" 
event={"ID":"849c8e45-ebca-4115-9048-0f54605f6c3c","Type":"ContainerStarted","Data":"2b8ed7fcfff60e237f89454604fe5c9db1d089f4f855a0d3f59765a79ac288d6"} Dec 03 07:51:25 crc kubenswrapper[4831]: I1203 07:51:25.571067 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:25 crc kubenswrapper[4831]: I1203 07:51:25.590962 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" podStartSLOduration=3.590945122 podStartE2EDuration="3.590945122s" podCreationTimestamp="2025-12-03 07:51:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:51:25.588980041 +0000 UTC m=+4822.932563589" watchObservedRunningTime="2025-12-03 07:51:25.590945122 +0000 UTC m=+4822.934528630" Dec 03 07:51:25 crc kubenswrapper[4831]: I1203 07:51:25.777438 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerName="rabbitmq" containerID="cri-o://54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44" gracePeriod=604798 Dec 03 07:51:26 crc kubenswrapper[4831]: I1203 07:51:26.297221 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.246:5672: connect: connection refused" Dec 03 07:51:27 crc kubenswrapper[4831]: I1203 07:51:27.259935 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.247:5672: connect: connection refused" Dec 03 07:51:27 crc kubenswrapper[4831]: I1203 07:51:27.597502 4831 patch_prober.go:28] interesting 
pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:51:27 crc kubenswrapper[4831]: I1203 07:51:27.597622 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:51:27 crc kubenswrapper[4831]: I1203 07:51:27.597691 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 07:51:27 crc kubenswrapper[4831]: I1203 07:51:27.598593 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3baec98df1747de5e0f1821202cad3b811d612c0c379c2da31045f274fd26772"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:51:27 crc kubenswrapper[4831]: I1203 07:51:27.598700 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://3baec98df1747de5e0f1821202cad3b811d612c0c379c2da31045f274fd26772" gracePeriod=600 Dec 03 07:51:28 crc kubenswrapper[4831]: I1203 07:51:28.602665 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="3baec98df1747de5e0f1821202cad3b811d612c0c379c2da31045f274fd26772" exitCode=0 Dec 03 07:51:28 crc kubenswrapper[4831]: I1203 07:51:28.602786 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"3baec98df1747de5e0f1821202cad3b811d612c0c379c2da31045f274fd26772"} Dec 03 07:51:28 crc kubenswrapper[4831]: I1203 07:51:28.603628 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28"} Dec 03 07:51:28 crc kubenswrapper[4831]: I1203 07:51:28.603688 4831 scope.go:117] "RemoveContainer" containerID="a49608d2f36ded443bc34b9c018da6725f163a050578c82d7230f76a78d013dc" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.646246 4831 generic.go:334] "Generic (PLEG): container finished" podID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerID="da397a07be27ad15aa344aca9beab4a97c426702d95be04d52d018c97e39e8c1" exitCode=0 Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.646364 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8","Type":"ContainerDied","Data":"da397a07be27ad15aa344aca9beab4a97c426702d95be04d52d018c97e39e8c1"} Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.772924 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960240 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-plugins\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960389 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-confd\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960450 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-erlang-cookie\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960629 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960698 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-pod-info\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960732 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-6qdzj\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-kube-api-access-6qdzj\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960757 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-server-conf\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960816 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-erlang-cookie-secret\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960850 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-plugins-conf\") pod \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\" (UID: \"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8\") " Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.960924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.961168 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.961219 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.961716 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.968541 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-pod-info" (OuterVolumeSpecName: "pod-info") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.968633 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-kube-api-access-6qdzj" (OuterVolumeSpecName: "kube-api-access-6qdzj") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "kube-api-access-6qdzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.970417 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:51:31 crc kubenswrapper[4831]: I1203 07:51:31.976484 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f" (OuterVolumeSpecName: "persistence") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.008454 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-server-conf" (OuterVolumeSpecName: "server-conf") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.057374 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" (UID: "4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.063081 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.063166 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") on node \"crc\" " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.063185 4831 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.063200 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qdzj\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-kube-api-access-6qdzj\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.063213 4831 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-server-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.063224 4831 reconciler_common.go:293] 
"Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.063234 4831 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.063245 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.084130 4831 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.084309 4831 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f") on node "crc" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.165267 4831 reconciler_common.go:293] "Volume detached for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.299004 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.469517 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-server-conf\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.469967 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-plugins\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470066 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-plugins-conf\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470094 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-erlang-cookie\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470126 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aea3084a-d57f-49c8-b479-d031b774e0e9-pod-info\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470196 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-confd\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470219 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aea3084a-d57f-49c8-b479-d031b774e0e9-erlang-cookie-secret\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470257 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5hh5\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-kube-api-access-c5hh5\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470435 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"aea3084a-d57f-49c8-b479-d031b774e0e9\" (UID: \"aea3084a-d57f-49c8-b479-d031b774e0e9\") " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470858 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.470907 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.471880 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.477687 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-kube-api-access-c5hh5" (OuterVolumeSpecName: "kube-api-access-c5hh5") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "kube-api-access-c5hh5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.479531 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/aea3084a-d57f-49c8-b479-d031b774e0e9-pod-info" (OuterVolumeSpecName: "pod-info") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.483526 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aea3084a-d57f-49c8-b479-d031b774e0e9-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.496947 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd" (OuterVolumeSpecName: "persistence") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.507085 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-server-conf" (OuterVolumeSpecName: "server-conf") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.572567 4831 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-plugins-conf\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.572847 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.573028 4831 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/aea3084a-d57f-49c8-b479-d031b774e0e9-pod-info\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.573146 4831 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/aea3084a-d57f-49c8-b479-d031b774e0e9-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.573259 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5hh5\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-kube-api-access-c5hh5\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.573436 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") on node \"crc\" " Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.573588 4831 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/aea3084a-d57f-49c8-b479-d031b774e0e9-server-conf\") on node \"crc\" DevicePath 
\"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.573723 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.591039 4831 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.591300 4831 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd") on node "crc" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.594266 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "aea3084a-d57f-49c8-b479-d031b774e0e9" (UID: "aea3084a-d57f-49c8-b479-d031b774e0e9"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.607493 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.652939 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-j6wcv"] Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.653156 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" podUID="9cc56b85-be93-41b1-bdef-8944d1b349c1" containerName="dnsmasq-dns" containerID="cri-o://08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6" gracePeriod=10 Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.658201 4831 generic.go:334] "Generic (PLEG): container finished" podID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerID="54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44" exitCode=0 Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.658268 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aea3084a-d57f-49c8-b479-d031b774e0e9","Type":"ContainerDied","Data":"54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44"} Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.658295 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"aea3084a-d57f-49c8-b479-d031b774e0e9","Type":"ContainerDied","Data":"439ba529d8f141ddaff252af4ad021b172c05334d977cc1f6c38a25e6e2b42c2"} Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.658313 4831 scope.go:117] "RemoveContainer" containerID="54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.658455 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.661608 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8","Type":"ContainerDied","Data":"dea9ce767b57d363c4896049ba5e840f8f3fbbe3114f88099d482633b6ac86df"} Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.661640 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.676539 4831 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/aea3084a-d57f-49c8-b479-d031b774e0e9-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.676562 4831 reconciler_common.go:293] "Volume detached for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.697066 4831 scope.go:117] "RemoveContainer" containerID="0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.738638 4831 scope.go:117] "RemoveContainer" containerID="54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44" Dec 03 07:51:32 crc kubenswrapper[4831]: E1203 07:51:32.739100 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44\": container with ID starting with 54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44 not found: ID does not exist" containerID="54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 
07:51:32.739130 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44"} err="failed to get container status \"54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44\": rpc error: code = NotFound desc = could not find container \"54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44\": container with ID starting with 54a2f2f769392b8a537b632302453b19e790c5f17d6f6e5bb6f0e9b6774a4f44 not found: ID does not exist" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.739151 4831 scope.go:117] "RemoveContainer" containerID="0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc" Dec 03 07:51:32 crc kubenswrapper[4831]: E1203 07:51:32.739377 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc\": container with ID starting with 0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc not found: ID does not exist" containerID="0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.739392 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc"} err="failed to get container status \"0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc\": rpc error: code = NotFound desc = could not find container \"0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc\": container with ID starting with 0e7a03bb2bcf62e1f0b114f3a19698114034a9d5575f252f9b50886b322bc2bc not found: ID does not exist" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.739403 4831 scope.go:117] "RemoveContainer" containerID="da397a07be27ad15aa344aca9beab4a97c426702d95be04d52d018c97e39e8c1" Dec 03 07:51:32 crc 
kubenswrapper[4831]: I1203 07:51:32.751762 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.759568 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.766336 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.771876 4831 scope.go:117] "RemoveContainer" containerID="a62aa85d22c43800bd710b912db5265b4bc85618bbf3308ad36ab60b8722d7d8" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.772976 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.787050 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:51:32 crc kubenswrapper[4831]: E1203 07:51:32.788608 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerName="rabbitmq" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.788632 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerName="rabbitmq" Dec 03 07:51:32 crc kubenswrapper[4831]: E1203 07:51:32.788649 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerName="setup-container" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.788659 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerName="setup-container" Dec 03 07:51:32 crc kubenswrapper[4831]: E1203 07:51:32.788687 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerName="setup-container" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.788696 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerName="setup-container" Dec 03 07:51:32 crc kubenswrapper[4831]: E1203 07:51:32.788711 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerName="rabbitmq" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.788719 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerName="rabbitmq" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.788885 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" containerName="rabbitmq" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.788900 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="aea3084a-d57f-49c8-b479-d031b774e0e9" containerName="rabbitmq" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.789705 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.800671 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.802683 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.802982 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.803237 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.803416 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-smx72" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.803544 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.809669 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.811490 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.812972 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.813255 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.813494 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-7p6cj" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.813669 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.813870 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.828841 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980534 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980831 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: 
\"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980863 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980890 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnzwd\" (UniqueName: \"kubernetes.io/projected/af0f187c-abc8-40f5-97a3-8e75e2e12769-kube-api-access-nnzwd\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980909 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980929 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980947 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/af0f187c-abc8-40f5-97a3-8e75e2e12769-pod-info\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") 
" pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980968 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/af0f187c-abc8-40f5-97a3-8e75e2e12769-server-conf\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.980997 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981020 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfcjm\" (UniqueName: \"kubernetes.io/projected/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-kube-api-access-pfcjm\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981037 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981069 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981101 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/af0f187c-abc8-40f5-97a3-8e75e2e12769-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981118 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981136 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/af0f187c-abc8-40f5-97a3-8e75e2e12769-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981153 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981178 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"rabbitmq-server-0\" 
(UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:32 crc kubenswrapper[4831]: I1203 07:51:32.981196 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.025138 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8" path="/var/lib/kubelet/pods/4eae5aa7-d1e1-40f2-b3ee-6a54ec93bad8/volumes" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.026072 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aea3084a-d57f-49c8-b479-d031b774e0e9" path="/var/lib/kubelet/pods/aea3084a-d57f-49c8-b479-d031b774e0e9/volumes" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.028206 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.082055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pfcjm\" (UniqueName: \"kubernetes.io/projected/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-kube-api-access-pfcjm\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.082106 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.082173 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.082218 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/af0f187c-abc8-40f5-97a3-8e75e2e12769-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.082242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc 
kubenswrapper[4831]: I1203 07:51:33.082264 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/af0f187c-abc8-40f5-97a3-8e75e2e12769-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.082287 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.082797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.082821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083153 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083194 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083231 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083252 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083275 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083310 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nnzwd\" (UniqueName: \"kubernetes.io/projected/af0f187c-abc8-40f5-97a3-8e75e2e12769-kube-api-access-nnzwd\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083344 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083372 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.085637 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.086389 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083358 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083366 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.083367 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/af0f187c-abc8-40f5-97a3-8e75e2e12769-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.086935 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/af0f187c-abc8-40f5-97a3-8e75e2e12769-pod-info\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.086983 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/af0f187c-abc8-40f5-97a3-8e75e2e12769-server-conf\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.087016 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.088836 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: 
I1203 07:51:33.090017 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/af0f187c-abc8-40f5-97a3-8e75e2e12769-server-conf\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.091984 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/af0f187c-abc8-40f5-97a3-8e75e2e12769-pod-info\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.092709 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.092737 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7f9fda22de49c54b2fe990b90ce000982097b618ca40422e3591ee1046704283/globalmount\"" pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.093400 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/af0f187c-abc8-40f5-97a3-8e75e2e12769-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.094963 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.095144 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fcef2430a6210fdc275c89f6d0f015cb2d33458f21cca7e3c75761454f338e2f/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.099522 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.099986 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfcjm\" (UniqueName: \"kubernetes.io/projected/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-kube-api-access-pfcjm\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.105021 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/af0f187c-abc8-40f5-97a3-8e75e2e12769-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.112605 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnzwd\" (UniqueName: \"kubernetes.io/projected/af0f187c-abc8-40f5-97a3-8e75e2e12769-kube-api-access-nnzwd\") pod \"rabbitmq-server-0\" (UID: 
\"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.113122 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/933ee7aa-0ba3-46f2-a093-ffe91b58f62e-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.135350 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d7c4b259-a3d6-4201-8b62-7c154876b13f\") pod \"rabbitmq-server-0\" (UID: \"af0f187c-abc8-40f5-97a3-8e75e2e12769\") " pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.135580 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-83e8a9d6-78e6-4206-a18b-18198bd83afd\") pod \"rabbitmq-cell1-server-0\" (UID: \"933ee7aa-0ba3-46f2-a093-ffe91b58f62e\") " pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.140911 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.190029 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw8vp\" (UniqueName: \"kubernetes.io/projected/9cc56b85-be93-41b1-bdef-8944d1b349c1-kube-api-access-pw8vp\") pod \"9cc56b85-be93-41b1-bdef-8944d1b349c1\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.190154 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-config\") pod \"9cc56b85-be93-41b1-bdef-8944d1b349c1\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.190246 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-dns-svc\") pod \"9cc56b85-be93-41b1-bdef-8944d1b349c1\" (UID: \"9cc56b85-be93-41b1-bdef-8944d1b349c1\") " Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.196177 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cc56b85-be93-41b1-bdef-8944d1b349c1-kube-api-access-pw8vp" (OuterVolumeSpecName: "kube-api-access-pw8vp") pod "9cc56b85-be93-41b1-bdef-8944d1b349c1" (UID: "9cc56b85-be93-41b1-bdef-8944d1b349c1"). InnerVolumeSpecName "kube-api-access-pw8vp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.235719 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-config" (OuterVolumeSpecName: "config") pod "9cc56b85-be93-41b1-bdef-8944d1b349c1" (UID: "9cc56b85-be93-41b1-bdef-8944d1b349c1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.247065 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "9cc56b85-be93-41b1-bdef-8944d1b349c1" (UID: "9cc56b85-be93-41b1-bdef-8944d1b349c1"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.292113 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.292150 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/9cc56b85-be93-41b1-bdef-8944d1b349c1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.292162 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw8vp\" (UniqueName: \"kubernetes.io/projected/9cc56b85-be93-41b1-bdef-8944d1b349c1-kube-api-access-pw8vp\") on node \"crc\" DevicePath \"\"" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.397389 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.425980 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.682039 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.683793 4831 generic.go:334] "Generic (PLEG): container finished" podID="9cc56b85-be93-41b1-bdef-8944d1b349c1" containerID="08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6" exitCode=0 Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.683861 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" event={"ID":"9cc56b85-be93-41b1-bdef-8944d1b349c1","Type":"ContainerDied","Data":"08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6"} Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.683881 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.683940 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-98ddfc8f-j6wcv" event={"ID":"9cc56b85-be93-41b1-bdef-8944d1b349c1","Type":"ContainerDied","Data":"357717be64f30a2810081ae6d091a9745b31ee6ec8fa7912654697aa9af11062"} Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.683968 4831 scope.go:117] "RemoveContainer" containerID="08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.686287 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"933ee7aa-0ba3-46f2-a093-ffe91b58f62e","Type":"ContainerStarted","Data":"2be1f8a3c3398a58d6e7823f73efae93c92550d5de617f8b1317cb69edecc1b6"} Dec 03 07:51:33 crc kubenswrapper[4831]: W1203 07:51:33.690843 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf0f187c_abc8_40f5_97a3_8e75e2e12769.slice/crio-0e2d5bdffd0d8059a018c6287bf5182127f3e0737fe1dc571834118291aa1e15 WatchSource:0}: Error finding container 0e2d5bdffd0d8059a018c6287bf5182127f3e0737fe1dc571834118291aa1e15: Status 404 returned error can't find the container with id 0e2d5bdffd0d8059a018c6287bf5182127f3e0737fe1dc571834118291aa1e15 Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.701246 4831 scope.go:117] "RemoveContainer" containerID="6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.715346 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-j6wcv"] Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.721309 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-98ddfc8f-j6wcv"] Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.740609 4831 scope.go:117] "RemoveContainer" containerID="08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6" Dec 03 07:51:33 crc kubenswrapper[4831]: E1203 07:51:33.741051 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6\": container with ID starting with 08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6 not found: ID does not exist" containerID="08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.741091 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6"} err="failed to get container status \"08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6\": rpc error: code = NotFound desc = could not find container 
\"08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6\": container with ID starting with 08434336e04985a50719af82b184e74ae688c22c1f3350b11aac64d36e7c6ec6 not found: ID does not exist" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.741117 4831 scope.go:117] "RemoveContainer" containerID="6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073" Dec 03 07:51:33 crc kubenswrapper[4831]: E1203 07:51:33.741544 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073\": container with ID starting with 6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073 not found: ID does not exist" containerID="6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073" Dec 03 07:51:33 crc kubenswrapper[4831]: I1203 07:51:33.741571 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073"} err="failed to get container status \"6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073\": rpc error: code = NotFound desc = could not find container \"6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073\": container with ID starting with 6379562fc7aa03f2783f2a5d6ab37169745b7d6902f69ca716f48f30bd368073 not found: ID does not exist" Dec 03 07:51:33 crc kubenswrapper[4831]: E1203 07:51:33.842660 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cc56b85_be93_41b1_bdef_8944d1b349c1.slice/crio-357717be64f30a2810081ae6d091a9745b31ee6ec8fa7912654697aa9af11062\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cc56b85_be93_41b1_bdef_8944d1b349c1.slice\": RecentStats: unable to find data in memory 
cache]" Dec 03 07:51:34 crc kubenswrapper[4831]: I1203 07:51:34.700347 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"af0f187c-abc8-40f5-97a3-8e75e2e12769","Type":"ContainerStarted","Data":"0e2d5bdffd0d8059a018c6287bf5182127f3e0737fe1dc571834118291aa1e15"} Dec 03 07:51:35 crc kubenswrapper[4831]: I1203 07:51:35.037692 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cc56b85-be93-41b1-bdef-8944d1b349c1" path="/var/lib/kubelet/pods/9cc56b85-be93-41b1-bdef-8944d1b349c1/volumes" Dec 03 07:51:35 crc kubenswrapper[4831]: I1203 07:51:35.715162 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"933ee7aa-0ba3-46f2-a093-ffe91b58f62e","Type":"ContainerStarted","Data":"f8cb5dc2c98f93359e5677123e2bb1efac402bc18f23adb8f69bd5747d224f4a"} Dec 03 07:51:36 crc kubenswrapper[4831]: I1203 07:51:36.728457 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"af0f187c-abc8-40f5-97a3-8e75e2e12769","Type":"ContainerStarted","Data":"9b59bce65a70c0d5d4c677bedd98f8d59c89b989ce2d44a02bf9fcc77df25fad"} Dec 03 07:52:09 crc kubenswrapper[4831]: I1203 07:52:09.079278 4831 generic.go:334] "Generic (PLEG): container finished" podID="933ee7aa-0ba3-46f2-a093-ffe91b58f62e" containerID="f8cb5dc2c98f93359e5677123e2bb1efac402bc18f23adb8f69bd5747d224f4a" exitCode=0 Dec 03 07:52:09 crc kubenswrapper[4831]: I1203 07:52:09.080084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"933ee7aa-0ba3-46f2-a093-ffe91b58f62e","Type":"ContainerDied","Data":"f8cb5dc2c98f93359e5677123e2bb1efac402bc18f23adb8f69bd5747d224f4a"} Dec 03 07:52:10 crc kubenswrapper[4831]: I1203 07:52:10.088967 4831 generic.go:334] "Generic (PLEG): container finished" podID="af0f187c-abc8-40f5-97a3-8e75e2e12769" containerID="9b59bce65a70c0d5d4c677bedd98f8d59c89b989ce2d44a02bf9fcc77df25fad" exitCode=0 Dec 
03 07:52:10 crc kubenswrapper[4831]: I1203 07:52:10.089057 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"af0f187c-abc8-40f5-97a3-8e75e2e12769","Type":"ContainerDied","Data":"9b59bce65a70c0d5d4c677bedd98f8d59c89b989ce2d44a02bf9fcc77df25fad"} Dec 03 07:52:10 crc kubenswrapper[4831]: I1203 07:52:10.091542 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"933ee7aa-0ba3-46f2-a093-ffe91b58f62e","Type":"ContainerStarted","Data":"438cd93f02b9b0b0296c328551cbe54b9d505da8aba2067705c4ee03753e5794"} Dec 03 07:52:10 crc kubenswrapper[4831]: I1203 07:52:10.091787 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:52:11 crc kubenswrapper[4831]: I1203 07:52:11.101185 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"af0f187c-abc8-40f5-97a3-8e75e2e12769","Type":"ContainerStarted","Data":"fe50341c87c9e5f348f0e3426fd152cfef974cf4cdd6681a5f4dac6d69e06f56"} Dec 03 07:52:11 crc kubenswrapper[4831]: I1203 07:52:11.101913 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Dec 03 07:52:11 crc kubenswrapper[4831]: I1203 07:52:11.128494 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=39.128477954 podStartE2EDuration="39.128477954s" podCreationTimestamp="2025-12-03 07:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:52:11.120457775 +0000 UTC m=+4868.464041283" watchObservedRunningTime="2025-12-03 07:52:11.128477954 +0000 UTC m=+4868.472061462" Dec 03 07:52:11 crc kubenswrapper[4831]: I1203 07:52:11.128837 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" 
podStartSLOduration=39.128832096 podStartE2EDuration="39.128832096s" podCreationTimestamp="2025-12-03 07:51:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:52:10.147826087 +0000 UTC m=+4867.491409595" watchObservedRunningTime="2025-12-03 07:52:11.128832096 +0000 UTC m=+4868.472415604" Dec 03 07:52:18 crc kubenswrapper[4831]: I1203 07:52:18.325765 4831 scope.go:117] "RemoveContainer" containerID="8715c38d9a5c580643523f51a3a9e615ad80d85056859b85d7fcb9963d0ce2a5" Dec 03 07:52:18 crc kubenswrapper[4831]: I1203 07:52:18.355248 4831 scope.go:117] "RemoveContainer" containerID="49630976e7fb8afc4adbd3cbaf0e095679d5b5f644e3d6b8d023db21da3ae1c6" Dec 03 07:52:18 crc kubenswrapper[4831]: I1203 07:52:18.410294 4831 scope.go:117] "RemoveContainer" containerID="e38b4507305a0ed640a5cf479cd4b9343ed4409d9609dbb9831335a341f4e4ff" Dec 03 07:52:23 crc kubenswrapper[4831]: I1203 07:52:23.145384 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Dec 03 07:52:23 crc kubenswrapper[4831]: I1203 07:52:23.429032 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Dec 03 07:52:35 crc kubenswrapper[4831]: I1203 07:52:35.939134 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 07:52:35 crc kubenswrapper[4831]: E1203 07:52:35.939977 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc56b85-be93-41b1-bdef-8944d1b349c1" containerName="init" Dec 03 07:52:35 crc kubenswrapper[4831]: I1203 07:52:35.940600 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc56b85-be93-41b1-bdef-8944d1b349c1" containerName="init" Dec 03 07:52:35 crc kubenswrapper[4831]: E1203 07:52:35.940632 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cc56b85-be93-41b1-bdef-8944d1b349c1" 
containerName="dnsmasq-dns" Dec 03 07:52:35 crc kubenswrapper[4831]: I1203 07:52:35.940674 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cc56b85-be93-41b1-bdef-8944d1b349c1" containerName="dnsmasq-dns" Dec 03 07:52:35 crc kubenswrapper[4831]: I1203 07:52:35.940872 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cc56b85-be93-41b1-bdef-8944d1b349c1" containerName="dnsmasq-dns" Dec 03 07:52:35 crc kubenswrapper[4831]: I1203 07:52:35.941506 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 03 07:52:35 crc kubenswrapper[4831]: I1203 07:52:35.944272 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8ddwt" Dec 03 07:52:35 crc kubenswrapper[4831]: I1203 07:52:35.956485 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 07:52:35 crc kubenswrapper[4831]: I1203 07:52:35.992499 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jr2c\" (UniqueName: \"kubernetes.io/projected/7b84a820-8823-44f4-be36-a64371b49bb5-kube-api-access-8jr2c\") pod \"mariadb-client-1-default\" (UID: \"7b84a820-8823-44f4-be36-a64371b49bb5\") " pod="openstack/mariadb-client-1-default" Dec 03 07:52:36 crc kubenswrapper[4831]: I1203 07:52:36.094043 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jr2c\" (UniqueName: \"kubernetes.io/projected/7b84a820-8823-44f4-be36-a64371b49bb5-kube-api-access-8jr2c\") pod \"mariadb-client-1-default\" (UID: \"7b84a820-8823-44f4-be36-a64371b49bb5\") " pod="openstack/mariadb-client-1-default" Dec 03 07:52:36 crc kubenswrapper[4831]: I1203 07:52:36.124290 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jr2c\" (UniqueName: \"kubernetes.io/projected/7b84a820-8823-44f4-be36-a64371b49bb5-kube-api-access-8jr2c\") pod 
\"mariadb-client-1-default\" (UID: \"7b84a820-8823-44f4-be36-a64371b49bb5\") " pod="openstack/mariadb-client-1-default" Dec 03 07:52:36 crc kubenswrapper[4831]: I1203 07:52:36.275476 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 03 07:52:36 crc kubenswrapper[4831]: I1203 07:52:36.612927 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 07:52:37 crc kubenswrapper[4831]: I1203 07:52:37.362008 4831 generic.go:334] "Generic (PLEG): container finished" podID="7b84a820-8823-44f4-be36-a64371b49bb5" containerID="42a31cf25feb27df8100aa18261dd18475478fd5d96abf0cf009e2bfe5e7135d" exitCode=0 Dec 03 07:52:37 crc kubenswrapper[4831]: I1203 07:52:37.362065 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"7b84a820-8823-44f4-be36-a64371b49bb5","Type":"ContainerDied","Data":"42a31cf25feb27df8100aa18261dd18475478fd5d96abf0cf009e2bfe5e7135d"} Dec 03 07:52:37 crc kubenswrapper[4831]: I1203 07:52:37.362409 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1-default" event={"ID":"7b84a820-8823-44f4-be36-a64371b49bb5","Type":"ContainerStarted","Data":"4e69c1af3dcc8309884264f1c8744802a959e34f61d92d28af39253e924ced7a"} Dec 03 07:52:38 crc kubenswrapper[4831]: I1203 07:52:38.779648 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 03 07:52:38 crc kubenswrapper[4831]: I1203 07:52:38.808226 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1-default_7b84a820-8823-44f4-be36-a64371b49bb5/mariadb-client-1-default/0.log" Dec 03 07:52:38 crc kubenswrapper[4831]: I1203 07:52:38.834685 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 07:52:38 crc kubenswrapper[4831]: I1203 07:52:38.839883 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jr2c\" (UniqueName: \"kubernetes.io/projected/7b84a820-8823-44f4-be36-a64371b49bb5-kube-api-access-8jr2c\") pod \"7b84a820-8823-44f4-be36-a64371b49bb5\" (UID: \"7b84a820-8823-44f4-be36-a64371b49bb5\") " Dec 03 07:52:38 crc kubenswrapper[4831]: I1203 07:52:38.844050 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1-default"] Dec 03 07:52:38 crc kubenswrapper[4831]: I1203 07:52:38.846063 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b84a820-8823-44f4-be36-a64371b49bb5-kube-api-access-8jr2c" (OuterVolumeSpecName: "kube-api-access-8jr2c") pod "7b84a820-8823-44f4-be36-a64371b49bb5" (UID: "7b84a820-8823-44f4-be36-a64371b49bb5"). InnerVolumeSpecName "kube-api-access-8jr2c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:52:38 crc kubenswrapper[4831]: I1203 07:52:38.941662 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jr2c\" (UniqueName: \"kubernetes.io/projected/7b84a820-8823-44f4-be36-a64371b49bb5-kube-api-access-8jr2c\") on node \"crc\" DevicePath \"\"" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.029195 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b84a820-8823-44f4-be36-a64371b49bb5" path="/var/lib/kubelet/pods/7b84a820-8823-44f4-be36-a64371b49bb5/volumes" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.315836 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 07:52:39 crc kubenswrapper[4831]: E1203 07:52:39.316286 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b84a820-8823-44f4-be36-a64371b49bb5" containerName="mariadb-client-1-default" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.316340 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b84a820-8823-44f4-be36-a64371b49bb5" containerName="mariadb-client-1-default" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.316624 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b84a820-8823-44f4-be36-a64371b49bb5" containerName="mariadb-client-1-default" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.317479 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.334376 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.349652 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfsrz\" (UniqueName: \"kubernetes.io/projected/52211ba9-6e7d-4d58-9f8e-52327db1963f-kube-api-access-sfsrz\") pod \"mariadb-client-2-default\" (UID: \"52211ba9-6e7d-4d58-9f8e-52327db1963f\") " pod="openstack/mariadb-client-2-default" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.378857 4831 scope.go:117] "RemoveContainer" containerID="42a31cf25feb27df8100aa18261dd18475478fd5d96abf0cf009e2bfe5e7135d" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.378903 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-1-default" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.451238 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfsrz\" (UniqueName: \"kubernetes.io/projected/52211ba9-6e7d-4d58-9f8e-52327db1963f-kube-api-access-sfsrz\") pod \"mariadb-client-2-default\" (UID: \"52211ba9-6e7d-4d58-9f8e-52327db1963f\") " pod="openstack/mariadb-client-2-default" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.477174 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfsrz\" (UniqueName: \"kubernetes.io/projected/52211ba9-6e7d-4d58-9f8e-52327db1963f-kube-api-access-sfsrz\") pod \"mariadb-client-2-default\" (UID: \"52211ba9-6e7d-4d58-9f8e-52327db1963f\") " pod="openstack/mariadb-client-2-default" Dec 03 07:52:39 crc kubenswrapper[4831]: I1203 07:52:39.657580 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 03 07:52:40 crc kubenswrapper[4831]: W1203 07:52:40.224778 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52211ba9_6e7d_4d58_9f8e_52327db1963f.slice/crio-94936fc0c0645d9b3a5797d5b836547fff8967bbb63d1d578a08fe8ec8a535ed WatchSource:0}: Error finding container 94936fc0c0645d9b3a5797d5b836547fff8967bbb63d1d578a08fe8ec8a535ed: Status 404 returned error can't find the container with id 94936fc0c0645d9b3a5797d5b836547fff8967bbb63d1d578a08fe8ec8a535ed Dec 03 07:52:40 crc kubenswrapper[4831]: I1203 07:52:40.225127 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 07:52:40 crc kubenswrapper[4831]: I1203 07:52:40.389113 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"52211ba9-6e7d-4d58-9f8e-52327db1963f","Type":"ContainerStarted","Data":"94936fc0c0645d9b3a5797d5b836547fff8967bbb63d1d578a08fe8ec8a535ed"} Dec 03 07:52:41 crc kubenswrapper[4831]: I1203 07:52:41.402945 4831 generic.go:334] "Generic (PLEG): container finished" podID="52211ba9-6e7d-4d58-9f8e-52327db1963f" containerID="1a3e0b44a0343d2cca7bdf808197809186e5f088f6b31573dbd59970be6c3085" exitCode=1 Dec 03 07:52:41 crc kubenswrapper[4831]: I1203 07:52:41.403062 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2-default" event={"ID":"52211ba9-6e7d-4d58-9f8e-52327db1963f","Type":"ContainerDied","Data":"1a3e0b44a0343d2cca7bdf808197809186e5f088f6b31573dbd59970be6c3085"} Dec 03 07:52:42 crc kubenswrapper[4831]: I1203 07:52:42.908132 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 03 07:52:42 crc kubenswrapper[4831]: I1203 07:52:42.934372 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2-default_52211ba9-6e7d-4d58-9f8e-52327db1963f/mariadb-client-2-default/0.log" Dec 03 07:52:42 crc kubenswrapper[4831]: I1203 07:52:42.971947 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 07:52:42 crc kubenswrapper[4831]: I1203 07:52:42.982348 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2-default"] Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.020231 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfsrz\" (UniqueName: \"kubernetes.io/projected/52211ba9-6e7d-4d58-9f8e-52327db1963f-kube-api-access-sfsrz\") pod \"52211ba9-6e7d-4d58-9f8e-52327db1963f\" (UID: \"52211ba9-6e7d-4d58-9f8e-52327db1963f\") " Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.026654 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52211ba9-6e7d-4d58-9f8e-52327db1963f-kube-api-access-sfsrz" (OuterVolumeSpecName: "kube-api-access-sfsrz") pod "52211ba9-6e7d-4d58-9f8e-52327db1963f" (UID: "52211ba9-6e7d-4d58-9f8e-52327db1963f"). InnerVolumeSpecName "kube-api-access-sfsrz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.122463 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfsrz\" (UniqueName: \"kubernetes.io/projected/52211ba9-6e7d-4d58-9f8e-52327db1963f-kube-api-access-sfsrz\") on node \"crc\" DevicePath \"\"" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.427229 4831 scope.go:117] "RemoveContainer" containerID="1a3e0b44a0343d2cca7bdf808197809186e5f088f6b31573dbd59970be6c3085" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.427284 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2-default" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.518667 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-1"] Dec 03 07:52:43 crc kubenswrapper[4831]: E1203 07:52:43.519144 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52211ba9-6e7d-4d58-9f8e-52327db1963f" containerName="mariadb-client-2-default" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.519175 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="52211ba9-6e7d-4d58-9f8e-52327db1963f" containerName="mariadb-client-2-default" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.519472 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="52211ba9-6e7d-4d58-9f8e-52327db1963f" containerName="mariadb-client-2-default" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.520295 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.526751 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8ddwt" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.533195 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.631995 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bsgr\" (UniqueName: \"kubernetes.io/projected/e8f614af-6e07-4d90-9bde-f7551332b1f2-kube-api-access-7bsgr\") pod \"mariadb-client-1\" (UID: \"e8f614af-6e07-4d90-9bde-f7551332b1f2\") " pod="openstack/mariadb-client-1" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.733657 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bsgr\" (UniqueName: \"kubernetes.io/projected/e8f614af-6e07-4d90-9bde-f7551332b1f2-kube-api-access-7bsgr\") pod \"mariadb-client-1\" (UID: \"e8f614af-6e07-4d90-9bde-f7551332b1f2\") " pod="openstack/mariadb-client-1" Dec 03 07:52:43 crc kubenswrapper[4831]: I1203 07:52:43.921022 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bsgr\" (UniqueName: \"kubernetes.io/projected/e8f614af-6e07-4d90-9bde-f7551332b1f2-kube-api-access-7bsgr\") pod \"mariadb-client-1\" (UID: \"e8f614af-6e07-4d90-9bde-f7551332b1f2\") " pod="openstack/mariadb-client-1" Dec 03 07:52:44 crc kubenswrapper[4831]: I1203 07:52:44.146725 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 03 07:52:44 crc kubenswrapper[4831]: I1203 07:52:44.746510 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-1"] Dec 03 07:52:44 crc kubenswrapper[4831]: W1203 07:52:44.747286 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8f614af_6e07_4d90_9bde_f7551332b1f2.slice/crio-7115ef770e6f293962580554695d15408d4799fd21872862e300be6beff06443 WatchSource:0}: Error finding container 7115ef770e6f293962580554695d15408d4799fd21872862e300be6beff06443: Status 404 returned error can't find the container with id 7115ef770e6f293962580554695d15408d4799fd21872862e300be6beff06443 Dec 03 07:52:45 crc kubenswrapper[4831]: I1203 07:52:45.025804 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52211ba9-6e7d-4d58-9f8e-52327db1963f" path="/var/lib/kubelet/pods/52211ba9-6e7d-4d58-9f8e-52327db1963f/volumes" Dec 03 07:52:45 crc kubenswrapper[4831]: I1203 07:52:45.443086 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"e8f614af-6e07-4d90-9bde-f7551332b1f2","Type":"ContainerStarted","Data":"1827fb7b07c920c84180ec7aeb648c8c903e1f314c151a1df9583c3b321718d7"} Dec 03 07:52:45 crc kubenswrapper[4831]: I1203 07:52:45.443132 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"e8f614af-6e07-4d90-9bde-f7551332b1f2","Type":"ContainerStarted","Data":"7115ef770e6f293962580554695d15408d4799fd21872862e300be6beff06443"} Dec 03 07:52:45 crc kubenswrapper[4831]: I1203 07:52:45.469813 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-1" podStartSLOduration=2.469779325 podStartE2EDuration="2.469779325s" podCreationTimestamp="2025-12-03 07:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2025-12-03 07:52:45.458184364 +0000 UTC m=+4902.801767902" watchObservedRunningTime="2025-12-03 07:52:45.469779325 +0000 UTC m=+4902.813362873" Dec 03 07:52:45 crc kubenswrapper[4831]: I1203 07:52:45.532798 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-1_e8f614af-6e07-4d90-9bde-f7551332b1f2/mariadb-client-1/0.log" Dec 03 07:52:46 crc kubenswrapper[4831]: I1203 07:52:46.454500 4831 generic.go:334] "Generic (PLEG): container finished" podID="e8f614af-6e07-4d90-9bde-f7551332b1f2" containerID="1827fb7b07c920c84180ec7aeb648c8c903e1f314c151a1df9583c3b321718d7" exitCode=0 Dec 03 07:52:46 crc kubenswrapper[4831]: I1203 07:52:46.454630 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-1" event={"ID":"e8f614af-6e07-4d90-9bde-f7551332b1f2","Type":"ContainerDied","Data":"1827fb7b07c920c84180ec7aeb648c8c903e1f314c151a1df9583c3b321718d7"} Dec 03 07:52:47 crc kubenswrapper[4831]: I1203 07:52:47.830850 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 03 07:52:47 crc kubenswrapper[4831]: I1203 07:52:47.872298 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-1"] Dec 03 07:52:47 crc kubenswrapper[4831]: I1203 07:52:47.877526 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-1"] Dec 03 07:52:47 crc kubenswrapper[4831]: I1203 07:52:47.903839 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bsgr\" (UniqueName: \"kubernetes.io/projected/e8f614af-6e07-4d90-9bde-f7551332b1f2-kube-api-access-7bsgr\") pod \"e8f614af-6e07-4d90-9bde-f7551332b1f2\" (UID: \"e8f614af-6e07-4d90-9bde-f7551332b1f2\") " Dec 03 07:52:47 crc kubenswrapper[4831]: I1203 07:52:47.909805 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8f614af-6e07-4d90-9bde-f7551332b1f2-kube-api-access-7bsgr" (OuterVolumeSpecName: "kube-api-access-7bsgr") pod "e8f614af-6e07-4d90-9bde-f7551332b1f2" (UID: "e8f614af-6e07-4d90-9bde-f7551332b1f2"). InnerVolumeSpecName "kube-api-access-7bsgr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.006246 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bsgr\" (UniqueName: \"kubernetes.io/projected/e8f614af-6e07-4d90-9bde-f7551332b1f2-kube-api-access-7bsgr\") on node \"crc\" DevicePath \"\"" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.386434 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 07:52:48 crc kubenswrapper[4831]: E1203 07:52:48.386936 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8f614af-6e07-4d90-9bde-f7551332b1f2" containerName="mariadb-client-1" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.386969 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8f614af-6e07-4d90-9bde-f7551332b1f2" containerName="mariadb-client-1" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.387230 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8f614af-6e07-4d90-9bde-f7551332b1f2" containerName="mariadb-client-1" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.388082 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.400851 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.505027 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7115ef770e6f293962580554695d15408d4799fd21872862e300be6beff06443" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.505399 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-1" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.515000 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f727k\" (UniqueName: \"kubernetes.io/projected/6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd-kube-api-access-f727k\") pod \"mariadb-client-4-default\" (UID: \"6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd\") " pod="openstack/mariadb-client-4-default" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.618072 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f727k\" (UniqueName: \"kubernetes.io/projected/6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd-kube-api-access-f727k\") pod \"mariadb-client-4-default\" (UID: \"6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd\") " pod="openstack/mariadb-client-4-default" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.656897 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f727k\" (UniqueName: \"kubernetes.io/projected/6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd-kube-api-access-f727k\") pod \"mariadb-client-4-default\" (UID: \"6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd\") " pod="openstack/mariadb-client-4-default" Dec 03 07:52:48 crc kubenswrapper[4831]: I1203 07:52:48.719540 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 03 07:52:49 crc kubenswrapper[4831]: I1203 07:52:49.038656 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8f614af-6e07-4d90-9bde-f7551332b1f2" path="/var/lib/kubelet/pods/e8f614af-6e07-4d90-9bde-f7551332b1f2/volumes" Dec 03 07:52:49 crc kubenswrapper[4831]: I1203 07:52:49.228893 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 07:52:49 crc kubenswrapper[4831]: I1203 07:52:49.515304 4831 generic.go:334] "Generic (PLEG): container finished" podID="6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd" containerID="b316cefc1fd9500ab0f27322656bffa009e671ffff01f4bb305fca334642d234" exitCode=0 Dec 03 07:52:49 crc kubenswrapper[4831]: I1203 07:52:49.515387 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd","Type":"ContainerDied","Data":"b316cefc1fd9500ab0f27322656bffa009e671ffff01f4bb305fca334642d234"} Dec 03 07:52:49 crc kubenswrapper[4831]: I1203 07:52:49.515440 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-4-default" event={"ID":"6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd","Type":"ContainerStarted","Data":"f2dd636239dbc72bade19b469130012e98ab7c339948c3264614f9c478ef6aac"} Dec 03 07:52:50 crc kubenswrapper[4831]: I1203 07:52:50.896612 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 03 07:52:50 crc kubenswrapper[4831]: I1203 07:52:50.920959 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-4-default_6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd/mariadb-client-4-default/0.log" Dec 03 07:52:50 crc kubenswrapper[4831]: I1203 07:52:50.958250 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 07:52:50 crc kubenswrapper[4831]: I1203 07:52:50.966033 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f727k\" (UniqueName: \"kubernetes.io/projected/6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd-kube-api-access-f727k\") pod \"6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd\" (UID: \"6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd\") " Dec 03 07:52:50 crc kubenswrapper[4831]: I1203 07:52:50.966299 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-4-default"] Dec 03 07:52:50 crc kubenswrapper[4831]: I1203 07:52:50.978586 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd-kube-api-access-f727k" (OuterVolumeSpecName: "kube-api-access-f727k") pod "6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd" (UID: "6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd"). InnerVolumeSpecName "kube-api-access-f727k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:52:51 crc kubenswrapper[4831]: I1203 07:52:51.027878 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd" path="/var/lib/kubelet/pods/6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd/volumes" Dec 03 07:52:51 crc kubenswrapper[4831]: I1203 07:52:51.067723 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f727k\" (UniqueName: \"kubernetes.io/projected/6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd-kube-api-access-f727k\") on node \"crc\" DevicePath \"\"" Dec 03 07:52:51 crc kubenswrapper[4831]: I1203 07:52:51.534675 4831 scope.go:117] "RemoveContainer" containerID="b316cefc1fd9500ab0f27322656bffa009e671ffff01f4bb305fca334642d234" Dec 03 07:52:51 crc kubenswrapper[4831]: I1203 07:52:51.534702 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-4-default" Dec 03 07:52:54 crc kubenswrapper[4831]: I1203 07:52:54.822690 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 07:52:54 crc kubenswrapper[4831]: E1203 07:52:54.823580 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd" containerName="mariadb-client-4-default" Dec 03 07:52:54 crc kubenswrapper[4831]: I1203 07:52:54.823607 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd" containerName="mariadb-client-4-default" Dec 03 07:52:54 crc kubenswrapper[4831]: I1203 07:52:54.823903 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a8551fd-f8eb-44ec-bd3a-4d1c545ad2cd" containerName="mariadb-client-4-default" Dec 03 07:52:54 crc kubenswrapper[4831]: I1203 07:52:54.824785 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 03 07:52:54 crc kubenswrapper[4831]: I1203 07:52:54.827225 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8ddwt" Dec 03 07:52:54 crc kubenswrapper[4831]: I1203 07:52:54.840664 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 07:52:54 crc kubenswrapper[4831]: I1203 07:52:54.937466 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m62t6\" (UniqueName: \"kubernetes.io/projected/ecd536a6-8935-44c6-96eb-19fe9ee833e0-kube-api-access-m62t6\") pod \"mariadb-client-5-default\" (UID: \"ecd536a6-8935-44c6-96eb-19fe9ee833e0\") " pod="openstack/mariadb-client-5-default" Dec 03 07:52:55 crc kubenswrapper[4831]: I1203 07:52:55.039653 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m62t6\" (UniqueName: \"kubernetes.io/projected/ecd536a6-8935-44c6-96eb-19fe9ee833e0-kube-api-access-m62t6\") pod \"mariadb-client-5-default\" (UID: \"ecd536a6-8935-44c6-96eb-19fe9ee833e0\") " pod="openstack/mariadb-client-5-default" Dec 03 07:52:55 crc kubenswrapper[4831]: I1203 07:52:55.061824 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m62t6\" (UniqueName: \"kubernetes.io/projected/ecd536a6-8935-44c6-96eb-19fe9ee833e0-kube-api-access-m62t6\") pod \"mariadb-client-5-default\" (UID: \"ecd536a6-8935-44c6-96eb-19fe9ee833e0\") " pod="openstack/mariadb-client-5-default" Dec 03 07:52:55 crc kubenswrapper[4831]: I1203 07:52:55.157034 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 03 07:52:55 crc kubenswrapper[4831]: I1203 07:52:55.712100 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 07:52:56 crc kubenswrapper[4831]: I1203 07:52:56.593940 4831 generic.go:334] "Generic (PLEG): container finished" podID="ecd536a6-8935-44c6-96eb-19fe9ee833e0" containerID="1119170d5a73d59578adef9531478efbb8273e8d56d789e589b57fec50ed3b61" exitCode=0 Dec 03 07:52:56 crc kubenswrapper[4831]: I1203 07:52:56.594068 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"ecd536a6-8935-44c6-96eb-19fe9ee833e0","Type":"ContainerDied","Data":"1119170d5a73d59578adef9531478efbb8273e8d56d789e589b57fec50ed3b61"} Dec 03 07:52:56 crc kubenswrapper[4831]: I1203 07:52:56.594538 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-5-default" event={"ID":"ecd536a6-8935-44c6-96eb-19fe9ee833e0","Type":"ContainerStarted","Data":"1974bf7b700c30ee905afe3ec357960f58b1c6fdf6afefe25afa055fcd58ad5b"} Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.175754 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.206107 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-5-default_ecd536a6-8935-44c6-96eb-19fe9ee833e0/mariadb-client-5-default/0.log" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.232592 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.241261 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-5-default"] Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.353685 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m62t6\" (UniqueName: \"kubernetes.io/projected/ecd536a6-8935-44c6-96eb-19fe9ee833e0-kube-api-access-m62t6\") pod \"ecd536a6-8935-44c6-96eb-19fe9ee833e0\" (UID: \"ecd536a6-8935-44c6-96eb-19fe9ee833e0\") " Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.361053 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ecd536a6-8935-44c6-96eb-19fe9ee833e0-kube-api-access-m62t6" (OuterVolumeSpecName: "kube-api-access-m62t6") pod "ecd536a6-8935-44c6-96eb-19fe9ee833e0" (UID: "ecd536a6-8935-44c6-96eb-19fe9ee833e0"). InnerVolumeSpecName "kube-api-access-m62t6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.370893 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 07:52:58 crc kubenswrapper[4831]: E1203 07:52:58.371422 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ecd536a6-8935-44c6-96eb-19fe9ee833e0" containerName="mariadb-client-5-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.371463 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ecd536a6-8935-44c6-96eb-19fe9ee833e0" containerName="mariadb-client-5-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.371765 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ecd536a6-8935-44c6-96eb-19fe9ee833e0" containerName="mariadb-client-5-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.372713 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.382902 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.455596 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m62t6\" (UniqueName: \"kubernetes.io/projected/ecd536a6-8935-44c6-96eb-19fe9ee833e0-kube-api-access-m62t6\") on node \"crc\" DevicePath \"\"" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.570767 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grqp2\" (UniqueName: \"kubernetes.io/projected/09536790-3a99-43d5-a013-4dfef03d2457-kube-api-access-grqp2\") pod \"mariadb-client-6-default\" (UID: \"09536790-3a99-43d5-a013-4dfef03d2457\") " pod="openstack/mariadb-client-6-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.618889 4831 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="1974bf7b700c30ee905afe3ec357960f58b1c6fdf6afefe25afa055fcd58ad5b" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.618967 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-5-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.672533 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grqp2\" (UniqueName: \"kubernetes.io/projected/09536790-3a99-43d5-a013-4dfef03d2457-kube-api-access-grqp2\") pod \"mariadb-client-6-default\" (UID: \"09536790-3a99-43d5-a013-4dfef03d2457\") " pod="openstack/mariadb-client-6-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.710801 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grqp2\" (UniqueName: \"kubernetes.io/projected/09536790-3a99-43d5-a013-4dfef03d2457-kube-api-access-grqp2\") pod \"mariadb-client-6-default\" (UID: \"09536790-3a99-43d5-a013-4dfef03d2457\") " pod="openstack/mariadb-client-6-default" Dec 03 07:52:58 crc kubenswrapper[4831]: I1203 07:52:58.715414 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 03 07:52:59 crc kubenswrapper[4831]: I1203 07:52:59.037630 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ecd536a6-8935-44c6-96eb-19fe9ee833e0" path="/var/lib/kubelet/pods/ecd536a6-8935-44c6-96eb-19fe9ee833e0/volumes" Dec 03 07:52:59 crc kubenswrapper[4831]: I1203 07:52:59.109858 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 07:52:59 crc kubenswrapper[4831]: W1203 07:52:59.136417 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09536790_3a99_43d5_a013_4dfef03d2457.slice/crio-b51c17cdc9ed58d36421c83a759ceec8b476fbb8fa0878b79f2a186f1d915f09 WatchSource:0}: Error finding container b51c17cdc9ed58d36421c83a759ceec8b476fbb8fa0878b79f2a186f1d915f09: Status 404 returned error can't find the container with id b51c17cdc9ed58d36421c83a759ceec8b476fbb8fa0878b79f2a186f1d915f09 Dec 03 07:52:59 crc kubenswrapper[4831]: I1203 07:52:59.630260 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"09536790-3a99-43d5-a013-4dfef03d2457","Type":"ContainerStarted","Data":"ee7ebf71b9bfbd5e2036592443931e9a199849cce989ced67546287f62619c78"} Dec 03 07:52:59 crc kubenswrapper[4831]: I1203 07:52:59.630673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"09536790-3a99-43d5-a013-4dfef03d2457","Type":"ContainerStarted","Data":"b51c17cdc9ed58d36421c83a759ceec8b476fbb8fa0878b79f2a186f1d915f09"} Dec 03 07:52:59 crc kubenswrapper[4831]: I1203 07:52:59.652420 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-client-6-default" podStartSLOduration=1.652393438 podStartE2EDuration="1.652393438s" podCreationTimestamp="2025-12-03 07:52:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:52:59.645702449 +0000 UTC m=+4916.989285997" watchObservedRunningTime="2025-12-03 07:52:59.652393438 +0000 UTC m=+4916.995976966" Dec 03 07:53:00 crc kubenswrapper[4831]: I1203 07:53:00.642228 4831 generic.go:334] "Generic (PLEG): container finished" podID="09536790-3a99-43d5-a013-4dfef03d2457" containerID="ee7ebf71b9bfbd5e2036592443931e9a199849cce989ced67546287f62619c78" exitCode=1 Dec 03 07:53:00 crc kubenswrapper[4831]: I1203 07:53:00.642359 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-6-default" event={"ID":"09536790-3a99-43d5-a013-4dfef03d2457","Type":"ContainerDied","Data":"ee7ebf71b9bfbd5e2036592443931e9a199849cce989ced67546287f62619c78"} Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.098062 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.136244 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.141519 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-6-default"] Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.228006 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grqp2\" (UniqueName: \"kubernetes.io/projected/09536790-3a99-43d5-a013-4dfef03d2457-kube-api-access-grqp2\") pod \"09536790-3a99-43d5-a013-4dfef03d2457\" (UID: \"09536790-3a99-43d5-a013-4dfef03d2457\") " Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.236147 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09536790-3a99-43d5-a013-4dfef03d2457-kube-api-access-grqp2" (OuterVolumeSpecName: "kube-api-access-grqp2") pod "09536790-3a99-43d5-a013-4dfef03d2457" (UID: 
"09536790-3a99-43d5-a013-4dfef03d2457"). InnerVolumeSpecName "kube-api-access-grqp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.260958 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 07:53:02 crc kubenswrapper[4831]: E1203 07:53:02.261439 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09536790-3a99-43d5-a013-4dfef03d2457" containerName="mariadb-client-6-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.261465 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="09536790-3a99-43d5-a013-4dfef03d2457" containerName="mariadb-client-6-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.261651 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="09536790-3a99-43d5-a013-4dfef03d2457" containerName="mariadb-client-6-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.262332 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.278159 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.330400 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grqp2\" (UniqueName: \"kubernetes.io/projected/09536790-3a99-43d5-a013-4dfef03d2457-kube-api-access-grqp2\") on node \"crc\" DevicePath \"\"" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.432375 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp2q2\" (UniqueName: \"kubernetes.io/projected/56680371-0eea-4f75-90c9-eaf9bdb4ee60-kube-api-access-qp2q2\") pod \"mariadb-client-7-default\" (UID: \"56680371-0eea-4f75-90c9-eaf9bdb4ee60\") " pod="openstack/mariadb-client-7-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.534481 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp2q2\" (UniqueName: \"kubernetes.io/projected/56680371-0eea-4f75-90c9-eaf9bdb4ee60-kube-api-access-qp2q2\") pod \"mariadb-client-7-default\" (UID: \"56680371-0eea-4f75-90c9-eaf9bdb4ee60\") " pod="openstack/mariadb-client-7-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.576973 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp2q2\" (UniqueName: \"kubernetes.io/projected/56680371-0eea-4f75-90c9-eaf9bdb4ee60-kube-api-access-qp2q2\") pod \"mariadb-client-7-default\" (UID: \"56680371-0eea-4f75-90c9-eaf9bdb4ee60\") " pod="openstack/mariadb-client-7-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.584678 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.667856 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b51c17cdc9ed58d36421c83a759ceec8b476fbb8fa0878b79f2a186f1d915f09" Dec 03 07:53:02 crc kubenswrapper[4831]: I1203 07:53:02.668301 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-6-default" Dec 03 07:53:03 crc kubenswrapper[4831]: I1203 07:53:03.023046 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09536790-3a99-43d5-a013-4dfef03d2457" path="/var/lib/kubelet/pods/09536790-3a99-43d5-a013-4dfef03d2457/volumes" Dec 03 07:53:03 crc kubenswrapper[4831]: I1203 07:53:03.166069 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 07:53:03 crc kubenswrapper[4831]: I1203 07:53:03.676281 4831 generic.go:334] "Generic (PLEG): container finished" podID="56680371-0eea-4f75-90c9-eaf9bdb4ee60" containerID="a78523f273d45c4a96f1498fe8b9235a8197c9a6cc18841acd55355f57009a0a" exitCode=0 Dec 03 07:53:03 crc kubenswrapper[4831]: I1203 07:53:03.676356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"56680371-0eea-4f75-90c9-eaf9bdb4ee60","Type":"ContainerDied","Data":"a78523f273d45c4a96f1498fe8b9235a8197c9a6cc18841acd55355f57009a0a"} Dec 03 07:53:03 crc kubenswrapper[4831]: I1203 07:53:03.676623 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-7-default" event={"ID":"56680371-0eea-4f75-90c9-eaf9bdb4ee60","Type":"ContainerStarted","Data":"500661600d0e1cf038992beddc8436eabdd772ff3c0b633737bfa6fd82ab410f"} Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.127963 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.148990 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-7-default_56680371-0eea-4f75-90c9-eaf9bdb4ee60/mariadb-client-7-default/0.log" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.175435 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.188978 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-7-default"] Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.291013 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qp2q2\" (UniqueName: \"kubernetes.io/projected/56680371-0eea-4f75-90c9-eaf9bdb4ee60-kube-api-access-qp2q2\") pod \"56680371-0eea-4f75-90c9-eaf9bdb4ee60\" (UID: \"56680371-0eea-4f75-90c9-eaf9bdb4ee60\") " Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.301534 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56680371-0eea-4f75-90c9-eaf9bdb4ee60-kube-api-access-qp2q2" (OuterVolumeSpecName: "kube-api-access-qp2q2") pod "56680371-0eea-4f75-90c9-eaf9bdb4ee60" (UID: "56680371-0eea-4f75-90c9-eaf9bdb4ee60"). InnerVolumeSpecName "kube-api-access-qp2q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.332548 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client-2"] Dec 03 07:53:05 crc kubenswrapper[4831]: E1203 07:53:05.332994 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56680371-0eea-4f75-90c9-eaf9bdb4ee60" containerName="mariadb-client-7-default" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.333011 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="56680371-0eea-4f75-90c9-eaf9bdb4ee60" containerName="mariadb-client-7-default" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.333225 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="56680371-0eea-4f75-90c9-eaf9bdb4ee60" containerName="mariadb-client-7-default" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.333906 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.341795 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.392631 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qp2q2\" (UniqueName: \"kubernetes.io/projected/56680371-0eea-4f75-90c9-eaf9bdb4ee60-kube-api-access-qp2q2\") on node \"crc\" DevicePath \"\"" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.493810 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rs5ll\" (UniqueName: \"kubernetes.io/projected/535990cb-cbfb-469f-9635-1a23a6e202cd-kube-api-access-rs5ll\") pod \"mariadb-client-2\" (UID: \"535990cb-cbfb-469f-9635-1a23a6e202cd\") " pod="openstack/mariadb-client-2" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.595759 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-rs5ll\" (UniqueName: \"kubernetes.io/projected/535990cb-cbfb-469f-9635-1a23a6e202cd-kube-api-access-rs5ll\") pod \"mariadb-client-2\" (UID: \"535990cb-cbfb-469f-9635-1a23a6e202cd\") " pod="openstack/mariadb-client-2" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.624919 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rs5ll\" (UniqueName: \"kubernetes.io/projected/535990cb-cbfb-469f-9635-1a23a6e202cd-kube-api-access-rs5ll\") pod \"mariadb-client-2\" (UID: \"535990cb-cbfb-469f-9635-1a23a6e202cd\") " pod="openstack/mariadb-client-2" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.661179 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.713507 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="500661600d0e1cf038992beddc8436eabdd772ff3c0b633737bfa6fd82ab410f" Dec 03 07:53:05 crc kubenswrapper[4831]: I1203 07:53:05.713628 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-7-default" Dec 03 07:53:06 crc kubenswrapper[4831]: I1203 07:53:06.275660 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 07:53:06 crc kubenswrapper[4831]: W1203 07:53:06.280912 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod535990cb_cbfb_469f_9635_1a23a6e202cd.slice/crio-6d563c13a58beba4e633cec35279f610cf52f2ed72cd4bee21a131daeec9aeb1 WatchSource:0}: Error finding container 6d563c13a58beba4e633cec35279f610cf52f2ed72cd4bee21a131daeec9aeb1: Status 404 returned error can't find the container with id 6d563c13a58beba4e633cec35279f610cf52f2ed72cd4bee21a131daeec9aeb1 Dec 03 07:53:06 crc kubenswrapper[4831]: I1203 07:53:06.725864 4831 generic.go:334] "Generic (PLEG): container finished" podID="535990cb-cbfb-469f-9635-1a23a6e202cd" containerID="dc7b50e85ee5b590677e46c724a121832ef46a8fb7d2589f1d2cbef19dc4a779" exitCode=0 Dec 03 07:53:06 crc kubenswrapper[4831]: I1203 07:53:06.725978 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"535990cb-cbfb-469f-9635-1a23a6e202cd","Type":"ContainerDied","Data":"dc7b50e85ee5b590677e46c724a121832ef46a8fb7d2589f1d2cbef19dc4a779"} Dec 03 07:53:06 crc kubenswrapper[4831]: I1203 07:53:06.726229 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client-2" event={"ID":"535990cb-cbfb-469f-9635-1a23a6e202cd","Type":"ContainerStarted","Data":"6d563c13a58beba4e633cec35279f610cf52f2ed72cd4bee21a131daeec9aeb1"} Dec 03 07:53:07 crc kubenswrapper[4831]: I1203 07:53:07.022504 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56680371-0eea-4f75-90c9-eaf9bdb4ee60" path="/var/lib/kubelet/pods/56680371-0eea-4f75-90c9-eaf9bdb4ee60/volumes" Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.270994 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.293795 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client-2_535990cb-cbfb-469f-9635-1a23a6e202cd/mariadb-client-2/0.log" Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.314333 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.321375 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client-2"] Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.452590 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rs5ll\" (UniqueName: \"kubernetes.io/projected/535990cb-cbfb-469f-9635-1a23a6e202cd-kube-api-access-rs5ll\") pod \"535990cb-cbfb-469f-9635-1a23a6e202cd\" (UID: \"535990cb-cbfb-469f-9635-1a23a6e202cd\") " Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.463566 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/535990cb-cbfb-469f-9635-1a23a6e202cd-kube-api-access-rs5ll" (OuterVolumeSpecName: "kube-api-access-rs5ll") pod "535990cb-cbfb-469f-9635-1a23a6e202cd" (UID: "535990cb-cbfb-469f-9635-1a23a6e202cd"). InnerVolumeSpecName "kube-api-access-rs5ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.554714 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rs5ll\" (UniqueName: \"kubernetes.io/projected/535990cb-cbfb-469f-9635-1a23a6e202cd-kube-api-access-rs5ll\") on node \"crc\" DevicePath \"\"" Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.761818 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d563c13a58beba4e633cec35279f610cf52f2ed72cd4bee21a131daeec9aeb1" Dec 03 07:53:08 crc kubenswrapper[4831]: I1203 07:53:08.761915 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client-2" Dec 03 07:53:09 crc kubenswrapper[4831]: I1203 07:53:09.025965 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="535990cb-cbfb-469f-9635-1a23a6e202cd" path="/var/lib/kubelet/pods/535990cb-cbfb-469f-9635-1a23a6e202cd/volumes" Dec 03 07:53:18 crc kubenswrapper[4831]: I1203 07:53:18.525261 4831 scope.go:117] "RemoveContainer" containerID="ccd04f15bf9e07532d64edd9da97c634a3ca7e855be113135dedbec011e7debc" Dec 03 07:53:27 crc kubenswrapper[4831]: I1203 07:53:27.597264 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:53:27 crc kubenswrapper[4831]: I1203 07:53:27.597838 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:53:57 crc kubenswrapper[4831]: I1203 07:53:57.597490 4831 patch_prober.go:28] 
interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:53:57 crc kubenswrapper[4831]: I1203 07:53:57.598218 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.545643 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vtksj"] Dec 03 07:54:00 crc kubenswrapper[4831]: E1203 07:54:00.546511 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="535990cb-cbfb-469f-9635-1a23a6e202cd" containerName="mariadb-client-2" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.546535 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="535990cb-cbfb-469f-9635-1a23a6e202cd" containerName="mariadb-client-2" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.546871 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="535990cb-cbfb-469f-9635-1a23a6e202cd" containerName="mariadb-client-2" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.548752 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.590800 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vtksj"] Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.601357 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1e16e-4331-43c5-94f9-73d6ad45157b-utilities\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.601418 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1e16e-4331-43c5-94f9-73d6ad45157b-catalog-content\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.601564 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79zt5\" (UniqueName: \"kubernetes.io/projected/6df1e16e-4331-43c5-94f9-73d6ad45157b-kube-api-access-79zt5\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.703909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1e16e-4331-43c5-94f9-73d6ad45157b-utilities\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.703999 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1e16e-4331-43c5-94f9-73d6ad45157b-catalog-content\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.704043 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79zt5\" (UniqueName: \"kubernetes.io/projected/6df1e16e-4331-43c5-94f9-73d6ad45157b-kube-api-access-79zt5\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.704940 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6df1e16e-4331-43c5-94f9-73d6ad45157b-catalog-content\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.704965 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6df1e16e-4331-43c5-94f9-73d6ad45157b-utilities\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.734058 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79zt5\" (UniqueName: \"kubernetes.io/projected/6df1e16e-4331-43c5-94f9-73d6ad45157b-kube-api-access-79zt5\") pod \"certified-operators-vtksj\" (UID: \"6df1e16e-4331-43c5-94f9-73d6ad45157b\") " pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:00 crc kubenswrapper[4831]: I1203 07:54:00.889289 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:01 crc kubenswrapper[4831]: I1203 07:54:01.473389 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vtksj"] Dec 03 07:54:02 crc kubenswrapper[4831]: I1203 07:54:02.267969 4831 generic.go:334] "Generic (PLEG): container finished" podID="6df1e16e-4331-43c5-94f9-73d6ad45157b" containerID="add5e01b198fe49facfedb00cdc72352ae74b421c5f11eef748f634a94ecbd08" exitCode=0 Dec 03 07:54:02 crc kubenswrapper[4831]: I1203 07:54:02.268034 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtksj" event={"ID":"6df1e16e-4331-43c5-94f9-73d6ad45157b","Type":"ContainerDied","Data":"add5e01b198fe49facfedb00cdc72352ae74b421c5f11eef748f634a94ecbd08"} Dec 03 07:54:02 crc kubenswrapper[4831]: I1203 07:54:02.268461 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtksj" event={"ID":"6df1e16e-4331-43c5-94f9-73d6ad45157b","Type":"ContainerStarted","Data":"46046798943dbb8c6bda98a80377832f989438fe461e85156fcabc64134a04c8"} Dec 03 07:54:07 crc kubenswrapper[4831]: I1203 07:54:07.313265 4831 generic.go:334] "Generic (PLEG): container finished" podID="6df1e16e-4331-43c5-94f9-73d6ad45157b" containerID="a46d71a9ce95e73369f8d3a42fe790285d9c5eb56cc405fed6a29898433ef297" exitCode=0 Dec 03 07:54:07 crc kubenswrapper[4831]: I1203 07:54:07.313959 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vtksj" event={"ID":"6df1e16e-4331-43c5-94f9-73d6ad45157b","Type":"ContainerDied","Data":"a46d71a9ce95e73369f8d3a42fe790285d9c5eb56cc405fed6a29898433ef297"} Dec 03 07:54:07 crc kubenswrapper[4831]: I1203 07:54:07.316715 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 07:54:08 crc kubenswrapper[4831]: I1203 07:54:08.324212 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/certified-operators-vtksj" event={"ID":"6df1e16e-4331-43c5-94f9-73d6ad45157b","Type":"ContainerStarted","Data":"8cbe6e50573fcc21bd2dcd07ffa4c22b1f6d6eb1b318cbff58e7485b0e409053"} Dec 03 07:54:08 crc kubenswrapper[4831]: I1203 07:54:08.355609 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vtksj" podStartSLOduration=2.6714457400000002 podStartE2EDuration="8.35558384s" podCreationTimestamp="2025-12-03 07:54:00 +0000 UTC" firstStartedPulling="2025-12-03 07:54:02.27042394 +0000 UTC m=+4979.614007488" lastFinishedPulling="2025-12-03 07:54:07.95456206 +0000 UTC m=+4985.298145588" observedRunningTime="2025-12-03 07:54:08.349249152 +0000 UTC m=+4985.692832670" watchObservedRunningTime="2025-12-03 07:54:08.35558384 +0000 UTC m=+4985.699167358" Dec 03 07:54:10 crc kubenswrapper[4831]: I1203 07:54:10.890005 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:10 crc kubenswrapper[4831]: I1203 07:54:10.890550 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:10 crc kubenswrapper[4831]: I1203 07:54:10.943043 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:20 crc kubenswrapper[4831]: I1203 07:54:20.971113 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vtksj" Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.041484 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vtksj"] Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.081333 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9mjxh"] Dec 03 07:54:21 crc 
kubenswrapper[4831]: I1203 07:54:21.081923 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-9mjxh" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerName="registry-server" containerID="cri-o://23eec0c855d888ba42fae44af3741ef3e321cf53ade8e352e3b32fd77a01a3be" gracePeriod=2 Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.457178 4831 generic.go:334] "Generic (PLEG): container finished" podID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerID="23eec0c855d888ba42fae44af3741ef3e321cf53ade8e352e3b32fd77a01a3be" exitCode=0 Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.457252 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mjxh" event={"ID":"03e08a40-1b90-40fb-a497-72589cdb0dcc","Type":"ContainerDied","Data":"23eec0c855d888ba42fae44af3741ef3e321cf53ade8e352e3b32fd77a01a3be"} Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.457296 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-9mjxh" event={"ID":"03e08a40-1b90-40fb-a497-72589cdb0dcc","Type":"ContainerDied","Data":"1f0860c25afe3a5fefa5d395091777f72dd6e55cc12904c7479fb95756a1b98a"} Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.457324 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1f0860c25afe3a5fefa5d395091777f72dd6e55cc12904c7479fb95756a1b98a" Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.495064 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9mjxh" Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.581611 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hztkx\" (UniqueName: \"kubernetes.io/projected/03e08a40-1b90-40fb-a497-72589cdb0dcc-kube-api-access-hztkx\") pod \"03e08a40-1b90-40fb-a497-72589cdb0dcc\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.581678 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-utilities\") pod \"03e08a40-1b90-40fb-a497-72589cdb0dcc\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.581704 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-catalog-content\") pod \"03e08a40-1b90-40fb-a497-72589cdb0dcc\" (UID: \"03e08a40-1b90-40fb-a497-72589cdb0dcc\") " Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.591006 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-utilities" (OuterVolumeSpecName: "utilities") pod "03e08a40-1b90-40fb-a497-72589cdb0dcc" (UID: "03e08a40-1b90-40fb-a497-72589cdb0dcc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.597174 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03e08a40-1b90-40fb-a497-72589cdb0dcc-kube-api-access-hztkx" (OuterVolumeSpecName: "kube-api-access-hztkx") pod "03e08a40-1b90-40fb-a497-72589cdb0dcc" (UID: "03e08a40-1b90-40fb-a497-72589cdb0dcc"). InnerVolumeSpecName "kube-api-access-hztkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.627122 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "03e08a40-1b90-40fb-a497-72589cdb0dcc" (UID: "03e08a40-1b90-40fb-a497-72589cdb0dcc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.683125 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hztkx\" (UniqueName: \"kubernetes.io/projected/03e08a40-1b90-40fb-a497-72589cdb0dcc-kube-api-access-hztkx\") on node \"crc\" DevicePath \"\"" Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.683160 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:54:21 crc kubenswrapper[4831]: I1203 07:54:21.683173 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/03e08a40-1b90-40fb-a497-72589cdb0dcc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:54:22 crc kubenswrapper[4831]: I1203 07:54:22.463870 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-9mjxh" Dec 03 07:54:22 crc kubenswrapper[4831]: I1203 07:54:22.498020 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-9mjxh"] Dec 03 07:54:22 crc kubenswrapper[4831]: I1203 07:54:22.504252 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-9mjxh"] Dec 03 07:54:23 crc kubenswrapper[4831]: I1203 07:54:23.023493 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" path="/var/lib/kubelet/pods/03e08a40-1b90-40fb-a497-72589cdb0dcc/volumes" Dec 03 07:54:27 crc kubenswrapper[4831]: I1203 07:54:27.596683 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 07:54:27 crc kubenswrapper[4831]: I1203 07:54:27.597134 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 07:54:27 crc kubenswrapper[4831]: I1203 07:54:27.597197 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 07:54:27 crc kubenswrapper[4831]: I1203 07:54:27.599050 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container 
machine-config-daemon failed liveness probe, will be restarted" Dec 03 07:54:27 crc kubenswrapper[4831]: I1203 07:54:27.599191 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" gracePeriod=600 Dec 03 07:54:27 crc kubenswrapper[4831]: E1203 07:54:27.748221 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:54:28 crc kubenswrapper[4831]: I1203 07:54:28.547882 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" exitCode=0 Dec 03 07:54:28 crc kubenswrapper[4831]: I1203 07:54:28.547997 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28"} Dec 03 07:54:28 crc kubenswrapper[4831]: I1203 07:54:28.548499 4831 scope.go:117] "RemoveContainer" containerID="3baec98df1747de5e0f1821202cad3b811d612c0c379c2da31045f274fd26772" Dec 03 07:54:28 crc kubenswrapper[4831]: I1203 07:54:28.549464 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:54:28 crc kubenswrapper[4831]: E1203 07:54:28.550046 4831 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:54:39 crc kubenswrapper[4831]: I1203 07:54:39.013422 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:54:39 crc kubenswrapper[4831]: E1203 07:54:39.015674 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:54:52 crc kubenswrapper[4831]: I1203 07:54:52.013649 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:54:52 crc kubenswrapper[4831]: E1203 07:54:52.014854 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:55:05 crc kubenswrapper[4831]: I1203 07:55:05.013475 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:55:05 crc kubenswrapper[4831]: E1203 07:55:05.014882 4831 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:55:18 crc kubenswrapper[4831]: I1203 07:55:18.653096 4831 scope.go:117] "RemoveContainer" containerID="20bb4a09eabde14a8879b6ae0982718167bf0eab33fded11b875033dabd64ebf" Dec 03 07:55:18 crc kubenswrapper[4831]: I1203 07:55:18.670693 4831 scope.go:117] "RemoveContainer" containerID="23eec0c855d888ba42fae44af3741ef3e321cf53ade8e352e3b32fd77a01a3be" Dec 03 07:55:18 crc kubenswrapper[4831]: I1203 07:55:18.701695 4831 scope.go:117] "RemoveContainer" containerID="639d13b1fb5d4e11c6ad425d8bd45f1ac36dd2d40829542d7c0a9dbeb3365f72" Dec 03 07:55:19 crc kubenswrapper[4831]: I1203 07:55:19.017172 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:55:19 crc kubenswrapper[4831]: E1203 07:55:19.017935 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:55:32 crc kubenswrapper[4831]: I1203 07:55:32.013611 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:55:32 crc kubenswrapper[4831]: E1203 07:55:32.014654 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:55:44 crc kubenswrapper[4831]: I1203 07:55:44.013906 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:55:44 crc kubenswrapper[4831]: E1203 07:55:44.014902 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:55:56 crc kubenswrapper[4831]: I1203 07:55:56.013193 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:55:56 crc kubenswrapper[4831]: E1203 07:55:56.014623 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:56:08 crc kubenswrapper[4831]: I1203 07:56:08.013070 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:56:08 crc kubenswrapper[4831]: E1203 07:56:08.013987 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:56:20 crc kubenswrapper[4831]: I1203 07:56:20.013815 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:56:20 crc kubenswrapper[4831]: E1203 07:56:20.014734 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:56:33 crc kubenswrapper[4831]: I1203 07:56:33.018810 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:56:33 crc kubenswrapper[4831]: E1203 07:56:33.019615 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:56:45 crc kubenswrapper[4831]: I1203 07:56:45.013194 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:56:45 crc kubenswrapper[4831]: E1203 07:56:45.014100 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:56:59 crc kubenswrapper[4831]: I1203 07:56:59.013740 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:56:59 crc kubenswrapper[4831]: E1203 07:56:59.014848 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:57:11 crc kubenswrapper[4831]: I1203 07:57:11.013638 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:57:11 crc kubenswrapper[4831]: E1203 07:57:11.014802 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:57:26 crc kubenswrapper[4831]: I1203 07:57:26.013850 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:57:26 crc kubenswrapper[4831]: E1203 07:57:26.014795 4831 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.165198 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lkhbs"] Dec 03 07:57:31 crc kubenswrapper[4831]: E1203 07:57:31.166555 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerName="registry-server" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.166581 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerName="registry-server" Dec 03 07:57:31 crc kubenswrapper[4831]: E1203 07:57:31.166611 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerName="extract-utilities" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.166624 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerName="extract-utilities" Dec 03 07:57:31 crc kubenswrapper[4831]: E1203 07:57:31.166656 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerName="extract-content" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.166669 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerName="extract-content" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.166966 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="03e08a40-1b90-40fb-a497-72589cdb0dcc" containerName="registry-server" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 
07:57:31.168859 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.187244 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkhbs"] Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.291743 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dd9w\" (UniqueName: \"kubernetes.io/projected/bf263ced-6d35-4805-b3a5-ef901f0c9405-kube-api-access-5dd9w\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.291831 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-utilities\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.291979 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-catalog-content\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.393135 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dd9w\" (UniqueName: \"kubernetes.io/projected/bf263ced-6d35-4805-b3a5-ef901f0c9405-kube-api-access-5dd9w\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc 
kubenswrapper[4831]: I1203 07:57:31.393201 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-utilities\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.393299 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-catalog-content\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.394044 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-utilities\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.394084 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-catalog-content\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.423180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dd9w\" (UniqueName: \"kubernetes.io/projected/bf263ced-6d35-4805-b3a5-ef901f0c9405-kube-api-access-5dd9w\") pod \"redhat-marketplace-lkhbs\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:31 crc kubenswrapper[4831]: I1203 07:57:31.508761 4831 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:32 crc kubenswrapper[4831]: I1203 07:57:31.998139 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkhbs"] Dec 03 07:57:32 crc kubenswrapper[4831]: I1203 07:57:32.455466 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerID="a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79" exitCode=0 Dec 03 07:57:32 crc kubenswrapper[4831]: I1203 07:57:32.455566 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkhbs" event={"ID":"bf263ced-6d35-4805-b3a5-ef901f0c9405","Type":"ContainerDied","Data":"a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79"} Dec 03 07:57:32 crc kubenswrapper[4831]: I1203 07:57:32.455919 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkhbs" event={"ID":"bf263ced-6d35-4805-b3a5-ef901f0c9405","Type":"ContainerStarted","Data":"0351c51fbd1632cf4d834fde388acba1a0f715b20c9084bc8f4fbb3e955c698f"} Dec 03 07:57:33 crc kubenswrapper[4831]: I1203 07:57:33.466820 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerID="2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57" exitCode=0 Dec 03 07:57:33 crc kubenswrapper[4831]: I1203 07:57:33.466927 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkhbs" event={"ID":"bf263ced-6d35-4805-b3a5-ef901f0c9405","Type":"ContainerDied","Data":"2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57"} Dec 03 07:57:34 crc kubenswrapper[4831]: I1203 07:57:34.477501 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkhbs" 
event={"ID":"bf263ced-6d35-4805-b3a5-ef901f0c9405","Type":"ContainerStarted","Data":"1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7"} Dec 03 07:57:34 crc kubenswrapper[4831]: I1203 07:57:34.501145 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lkhbs" podStartSLOduration=2.041619359 podStartE2EDuration="3.501128709s" podCreationTimestamp="2025-12-03 07:57:31 +0000 UTC" firstStartedPulling="2025-12-03 07:57:32.457746838 +0000 UTC m=+5189.801330386" lastFinishedPulling="2025-12-03 07:57:33.917256178 +0000 UTC m=+5191.260839736" observedRunningTime="2025-12-03 07:57:34.499392704 +0000 UTC m=+5191.842976212" watchObservedRunningTime="2025-12-03 07:57:34.501128709 +0000 UTC m=+5191.844712217" Dec 03 07:57:37 crc kubenswrapper[4831]: I1203 07:57:37.012852 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:57:37 crc kubenswrapper[4831]: E1203 07:57:37.013703 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:57:41 crc kubenswrapper[4831]: I1203 07:57:41.509823 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:41 crc kubenswrapper[4831]: I1203 07:57:41.510442 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:41 crc kubenswrapper[4831]: I1203 07:57:41.588642 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:41 crc kubenswrapper[4831]: I1203 07:57:41.649303 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:41 crc kubenswrapper[4831]: I1203 07:57:41.833240 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkhbs"] Dec 03 07:57:43 crc kubenswrapper[4831]: I1203 07:57:43.560812 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lkhbs" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerName="registry-server" containerID="cri-o://1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7" gracePeriod=2 Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.565092 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.574034 4831 generic.go:334] "Generic (PLEG): container finished" podID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerID="1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7" exitCode=0 Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.574091 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkhbs" event={"ID":"bf263ced-6d35-4805-b3a5-ef901f0c9405","Type":"ContainerDied","Data":"1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7"} Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.574113 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lkhbs" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.574137 4831 scope.go:117] "RemoveContainer" containerID="1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.574122 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lkhbs" event={"ID":"bf263ced-6d35-4805-b3a5-ef901f0c9405","Type":"ContainerDied","Data":"0351c51fbd1632cf4d834fde388acba1a0f715b20c9084bc8f4fbb3e955c698f"} Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.605201 4831 scope.go:117] "RemoveContainer" containerID="2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.607594 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-utilities\") pod \"bf263ced-6d35-4805-b3a5-ef901f0c9405\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.607761 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-catalog-content\") pod \"bf263ced-6d35-4805-b3a5-ef901f0c9405\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.608760 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-utilities" (OuterVolumeSpecName: "utilities") pod "bf263ced-6d35-4805-b3a5-ef901f0c9405" (UID: "bf263ced-6d35-4805-b3a5-ef901f0c9405"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.609667 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5dd9w\" (UniqueName: \"kubernetes.io/projected/bf263ced-6d35-4805-b3a5-ef901f0c9405-kube-api-access-5dd9w\") pod \"bf263ced-6d35-4805-b3a5-ef901f0c9405\" (UID: \"bf263ced-6d35-4805-b3a5-ef901f0c9405\") " Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.610411 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.624749 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf263ced-6d35-4805-b3a5-ef901f0c9405-kube-api-access-5dd9w" (OuterVolumeSpecName: "kube-api-access-5dd9w") pod "bf263ced-6d35-4805-b3a5-ef901f0c9405" (UID: "bf263ced-6d35-4805-b3a5-ef901f0c9405"). InnerVolumeSpecName "kube-api-access-5dd9w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.629332 4831 scope.go:117] "RemoveContainer" containerID="a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.630412 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bf263ced-6d35-4805-b3a5-ef901f0c9405" (UID: "bf263ced-6d35-4805-b3a5-ef901f0c9405"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.695865 4831 scope.go:117] "RemoveContainer" containerID="1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7" Dec 03 07:57:44 crc kubenswrapper[4831]: E1203 07:57:44.696352 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7\": container with ID starting with 1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7 not found: ID does not exist" containerID="1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.696382 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7"} err="failed to get container status \"1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7\": rpc error: code = NotFound desc = could not find container \"1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7\": container with ID starting with 1dff8f5f30d21ab408cec43c6c883cdfd0642be092cfd2338ee159d3a49e8aa7 not found: ID does not exist" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.696404 4831 scope.go:117] "RemoveContainer" containerID="2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57" Dec 03 07:57:44 crc kubenswrapper[4831]: E1203 07:57:44.696689 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57\": container with ID starting with 2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57 not found: ID does not exist" containerID="2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.696721 
4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57"} err="failed to get container status \"2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57\": rpc error: code = NotFound desc = could not find container \"2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57\": container with ID starting with 2b79ce5cd6c98544240683a8d6b7b9652c27b702870a4f258ea9e65b0c049a57 not found: ID does not exist" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.696735 4831 scope.go:117] "RemoveContainer" containerID="a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79" Dec 03 07:57:44 crc kubenswrapper[4831]: E1203 07:57:44.697108 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79\": container with ID starting with a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79 not found: ID does not exist" containerID="a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.697133 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79"} err="failed to get container status \"a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79\": rpc error: code = NotFound desc = could not find container \"a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79\": container with ID starting with a7f6d484dd4361733329e3a65f256f5dff263523f421a578247a2b1e40968e79 not found: ID does not exist" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.711416 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bf263ced-6d35-4805-b3a5-ef901f0c9405-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.711442 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5dd9w\" (UniqueName: \"kubernetes.io/projected/bf263ced-6d35-4805-b3a5-ef901f0c9405-kube-api-access-5dd9w\") on node \"crc\" DevicePath \"\"" Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.923689 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkhbs"] Dec 03 07:57:44 crc kubenswrapper[4831]: I1203 07:57:44.930646 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lkhbs"] Dec 03 07:57:45 crc kubenswrapper[4831]: I1203 07:57:45.031885 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" path="/var/lib/kubelet/pods/bf263ced-6d35-4805-b3a5-ef901f0c9405/volumes" Dec 03 07:57:51 crc kubenswrapper[4831]: I1203 07:57:51.013408 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:57:51 crc kubenswrapper[4831]: E1203 07:57:51.014014 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.534221 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 07:58:01 crc kubenswrapper[4831]: E1203 07:58:01.535101 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" 
containerName="extract-utilities" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.535119 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerName="extract-utilities" Dec 03 07:58:01 crc kubenswrapper[4831]: E1203 07:58:01.535135 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerName="extract-content" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.535143 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerName="extract-content" Dec 03 07:58:01 crc kubenswrapper[4831]: E1203 07:58:01.535154 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerName="registry-server" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.535162 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerName="registry-server" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.535400 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf263ced-6d35-4805-b3a5-ef901f0c9405" containerName="registry-server" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.535923 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.538254 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-8ddwt" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.543845 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.598521 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-39da139f-5ed3-4973-89fc-b011508a45de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de\") pod \"mariadb-copy-data\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") " pod="openstack/mariadb-copy-data" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.598576 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7xml\" (UniqueName: \"kubernetes.io/projected/18e2c38d-2593-41b7-a74a-74fd5671f27d-kube-api-access-f7xml\") pod \"mariadb-copy-data\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") " pod="openstack/mariadb-copy-data" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.699468 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7xml\" (UniqueName: \"kubernetes.io/projected/18e2c38d-2593-41b7-a74a-74fd5671f27d-kube-api-access-f7xml\") pod \"mariadb-copy-data\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") " pod="openstack/mariadb-copy-data" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.699641 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-39da139f-5ed3-4973-89fc-b011508a45de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de\") pod \"mariadb-copy-data\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") " pod="openstack/mariadb-copy-data" 
Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.702663 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.702703 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-39da139f-5ed3-4973-89fc-b011508a45de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de\") pod \"mariadb-copy-data\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8057d3c82af3e7aaa8b4c0a2150f1683d19648dae1ffb0bb9c695ffcd4bbc2dc/globalmount\"" pod="openstack/mariadb-copy-data" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.730302 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7xml\" (UniqueName: \"kubernetes.io/projected/18e2c38d-2593-41b7-a74a-74fd5671f27d-kube-api-access-f7xml\") pod \"mariadb-copy-data\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") " pod="openstack/mariadb-copy-data" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.746639 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-39da139f-5ed3-4973-89fc-b011508a45de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de\") pod \"mariadb-copy-data\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") " pod="openstack/mariadb-copy-data" Dec 03 07:58:01 crc kubenswrapper[4831]: I1203 07:58:01.857067 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-copy-data" Dec 03 07:58:02 crc kubenswrapper[4831]: I1203 07:58:02.493502 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 07:58:02 crc kubenswrapper[4831]: I1203 07:58:02.746127 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"18e2c38d-2593-41b7-a74a-74fd5671f27d","Type":"ContainerStarted","Data":"e1026a1036699d43cf788f6d01284c6257c26b6806cfb3a948b3e5cac425bee1"} Dec 03 07:58:02 crc kubenswrapper[4831]: I1203 07:58:02.746515 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"18e2c38d-2593-41b7-a74a-74fd5671f27d","Type":"ContainerStarted","Data":"cb7060465747354d2071d28bf5973e06448c8266d3daf9aa7801a5461856fed9"} Dec 03 07:58:02 crc kubenswrapper[4831]: I1203 07:58:02.763464 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/mariadb-copy-data" podStartSLOduration=2.7634335500000002 podStartE2EDuration="2.76343355s" podCreationTimestamp="2025-12-03 07:58:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:02.762844431 +0000 UTC m=+5220.106427929" watchObservedRunningTime="2025-12-03 07:58:02.76343355 +0000 UTC m=+5220.107017118" Dec 03 07:58:04 crc kubenswrapper[4831]: I1203 07:58:04.012426 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:58:04 crc kubenswrapper[4831]: E1203 07:58:04.013254 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:58:05 crc kubenswrapper[4831]: I1203 07:58:05.497803 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:05 crc kubenswrapper[4831]: I1203 07:58:05.499514 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 07:58:05 crc kubenswrapper[4831]: I1203 07:58:05.516760 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:05 crc kubenswrapper[4831]: I1203 07:58:05.563653 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m85cl\" (UniqueName: \"kubernetes.io/projected/81e96d0f-a8f1-477d-938c-62c5b99ec3cc-kube-api-access-m85cl\") pod \"mariadb-client\" (UID: \"81e96d0f-a8f1-477d-938c-62c5b99ec3cc\") " pod="openstack/mariadb-client" Dec 03 07:58:05 crc kubenswrapper[4831]: I1203 07:58:05.665766 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m85cl\" (UniqueName: \"kubernetes.io/projected/81e96d0f-a8f1-477d-938c-62c5b99ec3cc-kube-api-access-m85cl\") pod \"mariadb-client\" (UID: \"81e96d0f-a8f1-477d-938c-62c5b99ec3cc\") " pod="openstack/mariadb-client" Dec 03 07:58:05 crc kubenswrapper[4831]: I1203 07:58:05.692748 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m85cl\" (UniqueName: \"kubernetes.io/projected/81e96d0f-a8f1-477d-938c-62c5b99ec3cc-kube-api-access-m85cl\") pod \"mariadb-client\" (UID: \"81e96d0f-a8f1-477d-938c-62c5b99ec3cc\") " pod="openstack/mariadb-client" Dec 03 07:58:05 crc kubenswrapper[4831]: I1203 07:58:05.835778 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 07:58:06 crc kubenswrapper[4831]: I1203 07:58:06.131042 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:06 crc kubenswrapper[4831]: W1203 07:58:06.133270 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81e96d0f_a8f1_477d_938c_62c5b99ec3cc.slice/crio-f07c70fb6318974d15d61253ddc422bf46e7f4a5d5d1a32cd48a3a912a00b54a WatchSource:0}: Error finding container f07c70fb6318974d15d61253ddc422bf46e7f4a5d5d1a32cd48a3a912a00b54a: Status 404 returned error can't find the container with id f07c70fb6318974d15d61253ddc422bf46e7f4a5d5d1a32cd48a3a912a00b54a Dec 03 07:58:06 crc kubenswrapper[4831]: I1203 07:58:06.787303 4831 generic.go:334] "Generic (PLEG): container finished" podID="81e96d0f-a8f1-477d-938c-62c5b99ec3cc" containerID="3c6d705e4d2ae977f2dd90a037a2aae5787f76589898c4b2620b499a8514f34a" exitCode=0 Dec 03 07:58:06 crc kubenswrapper[4831]: I1203 07:58:06.787480 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"81e96d0f-a8f1-477d-938c-62c5b99ec3cc","Type":"ContainerDied","Data":"3c6d705e4d2ae977f2dd90a037a2aae5787f76589898c4b2620b499a8514f34a"} Dec 03 07:58:06 crc kubenswrapper[4831]: I1203 07:58:06.787794 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"81e96d0f-a8f1-477d-938c-62c5b99ec3cc","Type":"ContainerStarted","Data":"f07c70fb6318974d15d61253ddc422bf46e7f4a5d5d1a32cd48a3a912a00b54a"} Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.071849 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.093714 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_81e96d0f-a8f1-477d-938c-62c5b99ec3cc/mariadb-client/0.log" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.109043 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m85cl\" (UniqueName: \"kubernetes.io/projected/81e96d0f-a8f1-477d-938c-62c5b99ec3cc-kube-api-access-m85cl\") pod \"81e96d0f-a8f1-477d-938c-62c5b99ec3cc\" (UID: \"81e96d0f-a8f1-477d-938c-62c5b99ec3cc\") " Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.116517 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81e96d0f-a8f1-477d-938c-62c5b99ec3cc-kube-api-access-m85cl" (OuterVolumeSpecName: "kube-api-access-m85cl") pod "81e96d0f-a8f1-477d-938c-62c5b99ec3cc" (UID: "81e96d0f-a8f1-477d-938c-62c5b99ec3cc"). InnerVolumeSpecName "kube-api-access-m85cl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.119194 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.123972 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.212070 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m85cl\" (UniqueName: \"kubernetes.io/projected/81e96d0f-a8f1-477d-938c-62c5b99ec3cc-kube-api-access-m85cl\") on node \"crc\" DevicePath \"\"" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.258759 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:08 crc kubenswrapper[4831]: E1203 07:58:08.259201 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81e96d0f-a8f1-477d-938c-62c5b99ec3cc" containerName="mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.259230 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="81e96d0f-a8f1-477d-938c-62c5b99ec3cc" containerName="mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.259455 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="81e96d0f-a8f1-477d-938c-62c5b99ec3cc" containerName="mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.260133 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.275003 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.313404 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpzcp\" (UniqueName: \"kubernetes.io/projected/5a59ef51-d944-4d6b-a943-1fb0a50d1dea-kube-api-access-rpzcp\") pod \"mariadb-client\" (UID: \"5a59ef51-d944-4d6b-a943-1fb0a50d1dea\") " pod="openstack/mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.415387 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpzcp\" (UniqueName: \"kubernetes.io/projected/5a59ef51-d944-4d6b-a943-1fb0a50d1dea-kube-api-access-rpzcp\") pod \"mariadb-client\" (UID: \"5a59ef51-d944-4d6b-a943-1fb0a50d1dea\") " pod="openstack/mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.438252 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpzcp\" (UniqueName: \"kubernetes.io/projected/5a59ef51-d944-4d6b-a943-1fb0a50d1dea-kube-api-access-rpzcp\") pod \"mariadb-client\" (UID: \"5a59ef51-d944-4d6b-a943-1fb0a50d1dea\") " pod="openstack/mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.590262 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.805927 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f07c70fb6318974d15d61253ddc422bf46e7f4a5d5d1a32cd48a3a912a00b54a" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.806000 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/mariadb-client" Dec 03 07:58:08 crc kubenswrapper[4831]: I1203 07:58:08.832794 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/mariadb-client" oldPodUID="81e96d0f-a8f1-477d-938c-62c5b99ec3cc" podUID="5a59ef51-d944-4d6b-a943-1fb0a50d1dea" Dec 03 07:58:09 crc kubenswrapper[4831]: I1203 07:58:09.024099 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81e96d0f-a8f1-477d-938c-62c5b99ec3cc" path="/var/lib/kubelet/pods/81e96d0f-a8f1-477d-938c-62c5b99ec3cc/volumes" Dec 03 07:58:09 crc kubenswrapper[4831]: I1203 07:58:09.088304 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:09 crc kubenswrapper[4831]: W1203 07:58:09.093908 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5a59ef51_d944_4d6b_a943_1fb0a50d1dea.slice/crio-d3228db335d742b913415b5bb407c3f08dec766ec8c4f33d3bd939e1d41282f8 WatchSource:0}: Error finding container d3228db335d742b913415b5bb407c3f08dec766ec8c4f33d3bd939e1d41282f8: Status 404 returned error can't find the container with id d3228db335d742b913415b5bb407c3f08dec766ec8c4f33d3bd939e1d41282f8 Dec 03 07:58:09 crc kubenswrapper[4831]: I1203 07:58:09.830638 4831 generic.go:334] "Generic (PLEG): container finished" podID="5a59ef51-d944-4d6b-a943-1fb0a50d1dea" containerID="7011694c744a2a9af9fe6a52114969cdc7e7b33736f2062ec3de97055b6631c0" exitCode=0 Dec 03 07:58:09 crc kubenswrapper[4831]: I1203 07:58:09.830991 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" event={"ID":"5a59ef51-d944-4d6b-a943-1fb0a50d1dea","Type":"ContainerDied","Data":"7011694c744a2a9af9fe6a52114969cdc7e7b33736f2062ec3de97055b6631c0"} Dec 03 07:58:09 crc kubenswrapper[4831]: I1203 07:58:09.831147 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-client" 
event={"ID":"5a59ef51-d944-4d6b-a943-1fb0a50d1dea","Type":"ContainerStarted","Data":"d3228db335d742b913415b5bb407c3f08dec766ec8c4f33d3bd939e1d41282f8"} Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.232680 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.252925 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_mariadb-client_5a59ef51-d944-4d6b-a943-1fb0a50d1dea/mariadb-client/0.log" Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.267574 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpzcp\" (UniqueName: \"kubernetes.io/projected/5a59ef51-d944-4d6b-a943-1fb0a50d1dea-kube-api-access-rpzcp\") pod \"5a59ef51-d944-4d6b-a943-1fb0a50d1dea\" (UID: \"5a59ef51-d944-4d6b-a943-1fb0a50d1dea\") " Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.278034 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.292989 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-client"] Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.293636 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a59ef51-d944-4d6b-a943-1fb0a50d1dea-kube-api-access-rpzcp" (OuterVolumeSpecName: "kube-api-access-rpzcp") pod "5a59ef51-d944-4d6b-a943-1fb0a50d1dea" (UID: "5a59ef51-d944-4d6b-a943-1fb0a50d1dea"). InnerVolumeSpecName "kube-api-access-rpzcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.369567 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpzcp\" (UniqueName: \"kubernetes.io/projected/5a59ef51-d944-4d6b-a943-1fb0a50d1dea-kube-api-access-rpzcp\") on node \"crc\" DevicePath \"\"" Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.856741 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3228db335d742b913415b5bb407c3f08dec766ec8c4f33d3bd939e1d41282f8" Dec 03 07:58:11 crc kubenswrapper[4831]: I1203 07:58:11.856855 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-client" Dec 03 07:58:13 crc kubenswrapper[4831]: I1203 07:58:13.024906 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a59ef51-d944-4d6b-a943-1fb0a50d1dea" path="/var/lib/kubelet/pods/5a59ef51-d944-4d6b-a943-1fb0a50d1dea/volumes" Dec 03 07:58:18 crc kubenswrapper[4831]: I1203 07:58:18.013246 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:58:18 crc kubenswrapper[4831]: E1203 07:58:18.013980 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:58:31 crc kubenswrapper[4831]: I1203 07:58:31.014078 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:58:31 crc kubenswrapper[4831]: E1203 07:58:31.016173 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:58:40 crc kubenswrapper[4831]: E1203 07:58:40.454726 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.234:55842->38.102.83.234:39573: write tcp 38.102.83.234:55842->38.102.83.234:39573: write: broken pipe Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.113656 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:58:42 crc kubenswrapper[4831]: E1203 07:58:42.114283 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.884559 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:58:42 crc kubenswrapper[4831]: E1203 07:58:42.885097 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a59ef51-d944-4d6b-a943-1fb0a50d1dea" containerName="mariadb-client" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.885136 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a59ef51-d944-4d6b-a943-1fb0a50d1dea" containerName="mariadb-client" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.885500 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a59ef51-d944-4d6b-a943-1fb0a50d1dea" 
containerName="mariadb-client" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.935264 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.939211 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-csxgr" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.938562 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.946289 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.947563 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.954909 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.956788 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.965890 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.967167 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.973714 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 03 07:58:42 crc kubenswrapper[4831]: I1203 07:58:42.985024 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.009307 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e33c41f-a322-46e0-83ed-309466254c79-config\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.009391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e33c41f-a322-46e0-83ed-309466254c79-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.009441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tlw2\" (UniqueName: \"kubernetes.io/projected/7e33c41f-a322-46e0-83ed-309466254c79-kube-api-access-7tlw2\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.009576 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e33c41f-a322-46e0-83ed-309466254c79-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.009646 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-31b125f4-4cdb-4307-bfdc-43fe50f679ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31b125f4-4cdb-4307-bfdc-43fe50f679ed\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.009738 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e33c41f-a322-46e0-83ed-309466254c79-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.080790 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.082266 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.089816 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-wjx47" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.090105 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.094721 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110624 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tlw2\" (UniqueName: \"kubernetes.io/projected/7e33c41f-a322-46e0-83ed-309466254c79-kube-api-access-7tlw2\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 
07:58:43.110680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqh2c\" (UniqueName: \"kubernetes.io/projected/a1bcc142-49f9-466a-af9b-033b4375a87e-kube-api-access-lqh2c\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110705 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0589080a-1977-4ad4-9660-3db4472b78b4-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110735 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e33c41f-a322-46e0-83ed-309466254c79-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110763 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bcc142-49f9-466a-af9b-033b4375a87e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110782 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-31b125f4-4cdb-4307-bfdc-43fe50f679ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31b125f4-4cdb-4307-bfdc-43fe50f679ed\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110800 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bcc142-49f9-466a-af9b-033b4375a87e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110817 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0589080a-1977-4ad4-9660-3db4472b78b4-config\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110832 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0589080a-1977-4ad4-9660-3db4472b78b4-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110878 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bcc142-49f9-466a-af9b-033b4375a87e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110922 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e33c41f-a322-46e0-83ed-309466254c79-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110941 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-ff9b050a-95eb-42f6-b375-e5d4aa16702b\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff9b050a-95eb-42f6-b375-e5d4aa16702b\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110964 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-a8637635-760d-44f7-a288-906450e44dff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8637635-760d-44f7-a288-906450e44dff\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.110986 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0589080a-1977-4ad4-9660-3db4472b78b4-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.111031 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e33c41f-a322-46e0-83ed-309466254c79-config\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.111045 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bcc142-49f9-466a-af9b-033b4375a87e-config\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.111072 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hn6sv\" (UniqueName: 
\"kubernetes.io/projected/0589080a-1977-4ad4-9660-3db4472b78b4-kube-api-access-hn6sv\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.111091 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e33c41f-a322-46e0-83ed-309466254c79-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.111492 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7e33c41f-a322-46e0-83ed-309466254c79-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.112394 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e33c41f-a322-46e0-83ed-309466254c79-config\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.113386 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7e33c41f-a322-46e0-83ed-309466254c79-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.125096 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.128774 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e33c41f-a322-46e0-83ed-309466254c79-combined-ca-bundle\") 
pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.131429 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.132513 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.132553 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-31b125f4-4cdb-4307-bfdc-43fe50f679ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31b125f4-4cdb-4307-bfdc-43fe50f679ed\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/58d7a5ccb0383581ea5d320cac2a22f947c2bd0f33e0a4c47610eb13c84b2c00/globalmount\"" pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.133420 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.138471 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.140908 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tlw2\" (UniqueName: \"kubernetes.io/projected/7e33c41f-a322-46e0-83ed-309466254c79-kube-api-access-7tlw2\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.141240 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.161456 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.170661 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.181713 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-31b125f4-4cdb-4307-bfdc-43fe50f679ed\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-31b125f4-4cdb-4307-bfdc-43fe50f679ed\") pod \"ovsdbserver-nb-0\" (UID: \"7e33c41f-a322-46e0-83ed-309466254c79\") " pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212576 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-a8637635-760d-44f7-a288-906450e44dff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8637635-760d-44f7-a288-906450e44dff\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212637 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-32e721e8-3107-4a36-8c5a-350bb366fe05\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32e721e8-3107-4a36-8c5a-350bb366fe05\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212661 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvbwx\" (UniqueName: \"kubernetes.io/projected/acbbc7a4-e670-4315-b4d2-28702c8af2aa-kube-api-access-bvbwx\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 
07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212682 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0589080a-1977-4ad4-9660-3db4472b78b4-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2131c614-9f45-4b2b-99cf-1b830da4013a-config\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212727 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2131c614-9f45-4b2b-99cf-1b830da4013a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212746 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/acbbc7a4-e670-4315-b4d2-28702c8af2aa-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212763 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bcc142-49f9-466a-af9b-033b4375a87e-config\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212777 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-m4m5t\" (UniqueName: \"kubernetes.io/projected/80e2b474-5b25-4074-920b-844874ab8fab-kube-api-access-m4m5t\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212804 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hn6sv\" (UniqueName: \"kubernetes.io/projected/0589080a-1977-4ad4-9660-3db4472b78b4-kube-api-access-hn6sv\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212827 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-7da54e56-fc86-428a-b904-0aaa73b2a446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7da54e56-fc86-428a-b904-0aaa73b2a446\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212859 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2131c614-9f45-4b2b-99cf-1b830da4013a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-9184547e-9130-4101-8774-8ccf24276007\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9184547e-9130-4101-8774-8ccf24276007\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212900 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2131c614-9f45-4b2b-99cf-1b830da4013a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212914 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqh2c\" (UniqueName: \"kubernetes.io/projected/a1bcc142-49f9-466a-af9b-033b4375a87e-kube-api-access-lqh2c\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/80e2b474-5b25-4074-920b-844874ab8fab-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212956 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0589080a-1977-4ad4-9660-3db4472b78b4-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212979 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bcc142-49f9-466a-af9b-033b4375a87e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.212996 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/80e2b474-5b25-4074-920b-844874ab8fab-config\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213016 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bcc142-49f9-466a-af9b-033b4375a87e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213032 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acbbc7a4-e670-4315-b4d2-28702c8af2aa-config\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213049 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wb9cc\" (UniqueName: \"kubernetes.io/projected/2131c614-9f45-4b2b-99cf-1b830da4013a-kube-api-access-wb9cc\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213065 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0589080a-1977-4ad4-9660-3db4472b78b4-config\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213087 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0589080a-1977-4ad4-9660-3db4472b78b4-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " 
pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213104 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bcc142-49f9-466a-af9b-033b4375a87e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213121 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80e2b474-5b25-4074-920b-844874ab8fab-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213137 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbbc7a4-e670-4315-b4d2-28702c8af2aa-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213159 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acbbc7a4-e670-4315-b4d2-28702c8af2aa-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.213178 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-ff9b050a-95eb-42f6-b375-e5d4aa16702b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff9b050a-95eb-42f6-b375-e5d4aa16702b\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 
07:58:43.213194 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e2b474-5b25-4074-920b-844874ab8fab-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.220181 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/0589080a-1977-4ad4-9660-3db4472b78b4-ovsdb-rundir\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.226837 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.226885 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-a8637635-760d-44f7-a288-906450e44dff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8637635-760d-44f7-a288-906450e44dff\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/067fc81443cbf65055d030822297626be15c896704b243cb1e25d01618aa4e45/globalmount\"" pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.227216 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0589080a-1977-4ad4-9660-3db4472b78b4-scripts\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.227242 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/0589080a-1977-4ad4-9660-3db4472b78b4-config\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.227338 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a1bcc142-49f9-466a-af9b-033b4375a87e-combined-ca-bundle\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.227570 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/a1bcc142-49f9-466a-af9b-033b4375a87e-ovsdb-rundir\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.227754 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.227800 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-ff9b050a-95eb-42f6-b375-e5d4aa16702b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff9b050a-95eb-42f6-b375-e5d4aa16702b\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/4123a3090b52479515cf7d39982f49aedbd813baea591a26c9816ab3d89bc405/globalmount\"" pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.228187 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a1bcc142-49f9-466a-af9b-033b4375a87e-config\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.228473 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/a1bcc142-49f9-466a-af9b-033b4375a87e-scripts\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.230017 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0589080a-1977-4ad4-9660-3db4472b78b4-combined-ca-bundle\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.231056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hn6sv\" (UniqueName: \"kubernetes.io/projected/0589080a-1977-4ad4-9660-3db4472b78b4-kube-api-access-hn6sv\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 
03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.235060 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqh2c\" (UniqueName: \"kubernetes.io/projected/a1bcc142-49f9-466a-af9b-033b4375a87e-kube-api-access-lqh2c\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.260922 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-a8637635-760d-44f7-a288-906450e44dff\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-a8637635-760d-44f7-a288-906450e44dff\") pod \"ovsdbserver-nb-1\" (UID: \"a1bcc142-49f9-466a-af9b-033b4375a87e\") " pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.262188 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-ff9b050a-95eb-42f6-b375-e5d4aa16702b\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-ff9b050a-95eb-42f6-b375-e5d4aa16702b\") pod \"ovsdbserver-nb-2\" (UID: \"0589080a-1977-4ad4-9660-3db4472b78b4\") " pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.281661 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.298423 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.310082 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315155 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2131c614-9f45-4b2b-99cf-1b830da4013a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-9184547e-9130-4101-8774-8ccf24276007\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9184547e-9130-4101-8774-8ccf24276007\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315247 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2131c614-9f45-4b2b-99cf-1b830da4013a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315266 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/80e2b474-5b25-4074-920b-844874ab8fab-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/80e2b474-5b25-4074-920b-844874ab8fab-config\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315384 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wb9cc\" (UniqueName: \"kubernetes.io/projected/2131c614-9f45-4b2b-99cf-1b830da4013a-kube-api-access-wb9cc\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315406 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acbbc7a4-e670-4315-b4d2-28702c8af2aa-config\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315439 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80e2b474-5b25-4074-920b-844874ab8fab-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315465 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbbc7a4-e670-4315-b4d2-28702c8af2aa-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315495 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acbbc7a4-e670-4315-b4d2-28702c8af2aa-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315521 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e2b474-5b25-4074-920b-844874ab8fab-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: 
\"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315556 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-32e721e8-3107-4a36-8c5a-350bb366fe05\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32e721e8-3107-4a36-8c5a-350bb366fe05\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvbwx\" (UniqueName: \"kubernetes.io/projected/acbbc7a4-e670-4315-b4d2-28702c8af2aa-kube-api-access-bvbwx\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315607 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2131c614-9f45-4b2b-99cf-1b830da4013a-config\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315659 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2131c614-9f45-4b2b-99cf-1b830da4013a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315685 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/acbbc7a4-e670-4315-b4d2-28702c8af2aa-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315708 
4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4m5t\" (UniqueName: \"kubernetes.io/projected/80e2b474-5b25-4074-920b-844874ab8fab-kube-api-access-m4m5t\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.315746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-7da54e56-fc86-428a-b904-0aaa73b2a446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7da54e56-fc86-428a-b904-0aaa73b2a446\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.316000 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/80e2b474-5b25-4074-920b-844874ab8fab-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.316483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2131c614-9f45-4b2b-99cf-1b830da4013a-scripts\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.316569 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/acbbc7a4-e670-4315-b4d2-28702c8af2aa-ovsdb-rundir\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.316897 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/80e2b474-5b25-4074-920b-844874ab8fab-config\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.317595 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/acbbc7a4-e670-4315-b4d2-28702c8af2aa-config\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.317668 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2131c614-9f45-4b2b-99cf-1b830da4013a-config\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.318052 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/2131c614-9f45-4b2b-99cf-1b830da4013a-ovsdb-rundir\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.318286 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acbbc7a4-e670-4315-b4d2-28702c8af2aa-scripts\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.319020 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/80e2b474-5b25-4074-920b-844874ab8fab-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.321096 4831 csi_attacher.go:380] 
kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.321125 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-9184547e-9130-4101-8774-8ccf24276007\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9184547e-9130-4101-8774-8ccf24276007\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/f9ea67928c9e8b5c441e9e01e8c127ea8d6e50b5659b3e2e55373db035943245/globalmount\"" pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.321071 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2131c614-9f45-4b2b-99cf-1b830da4013a-combined-ca-bundle\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.321773 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.321864 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-32e721e8-3107-4a36-8c5a-350bb366fe05\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32e721e8-3107-4a36-8c5a-350bb366fe05\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/fd9db860202d04011c81ec3d4a1ff809c8b9f75320eb2273b6255ac7b43a7632/globalmount\"" pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.322060 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80e2b474-5b25-4074-920b-844874ab8fab-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.322422 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acbbc7a4-e670-4315-b4d2-28702c8af2aa-combined-ca-bundle\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.322765 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.322872 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-7da54e56-fc86-428a-b904-0aaa73b2a446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7da54e56-fc86-428a-b904-0aaa73b2a446\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/efb61dc03ed876d52a0382d0d3f9690add150cd4c3d4e9b1cc168b99d01756f3/globalmount\"" pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.340161 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4m5t\" (UniqueName: \"kubernetes.io/projected/80e2b474-5b25-4074-920b-844874ab8fab-kube-api-access-m4m5t\") pod \"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.340743 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wb9cc\" (UniqueName: \"kubernetes.io/projected/2131c614-9f45-4b2b-99cf-1b830da4013a-kube-api-access-wb9cc\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.341380 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvbwx\" (UniqueName: \"kubernetes.io/projected/acbbc7a4-e670-4315-b4d2-28702c8af2aa-kube-api-access-bvbwx\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.364967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-32e721e8-3107-4a36-8c5a-350bb366fe05\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-32e721e8-3107-4a36-8c5a-350bb366fe05\") pod 
\"ovsdbserver-sb-0\" (UID: \"80e2b474-5b25-4074-920b-844874ab8fab\") " pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.373971 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-9184547e-9130-4101-8774-8ccf24276007\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-9184547e-9130-4101-8774-8ccf24276007\") pod \"ovsdbserver-sb-1\" (UID: \"2131c614-9f45-4b2b-99cf-1b830da4013a\") " pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.386416 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-7da54e56-fc86-428a-b904-0aaa73b2a446\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-7da54e56-fc86-428a-b904-0aaa73b2a446\") pod \"ovsdbserver-sb-2\" (UID: \"acbbc7a4-e670-4315-b4d2-28702c8af2aa\") " pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.429627 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.497999 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.510812 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.768128 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Dec 03 07:58:43 crc kubenswrapper[4831]: I1203 07:58:43.863727 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-2"] Dec 03 07:58:43 crc kubenswrapper[4831]: W1203 07:58:43.870732 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0589080a_1977_4ad4_9660_3db4472b78b4.slice/crio-c5993d58eb63696bc9f75a6e7f793fcfd3fb15b0acf4d039ad6650ea7823e115 WatchSource:0}: Error finding container c5993d58eb63696bc9f75a6e7f793fcfd3fb15b0acf4d039ad6650ea7823e115: Status 404 returned error can't find the container with id c5993d58eb63696bc9f75a6e7f793fcfd3fb15b0acf4d039ad6650ea7823e115 Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.045443 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-2"] Dec 03 07:58:44 crc kubenswrapper[4831]: W1203 07:58:44.057794 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podacbbc7a4_e670_4315_b4d2_28702c8af2aa.slice/crio-3315ae3b282c0d6344b797a81a306e1801811e060ef222ec8c5d417ec53ff8c9 WatchSource:0}: Error finding container 3315ae3b282c0d6344b797a81a306e1801811e060ef222ec8c5d417ec53ff8c9: Status 404 returned error can't find the container with id 3315ae3b282c0d6344b797a81a306e1801811e060ef222ec8c5d417ec53ff8c9 Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.221500 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0589080a-1977-4ad4-9660-3db4472b78b4","Type":"ContainerStarted","Data":"20c8e72fccbf9b1d5fa895e1115db6ccbd55e1dedc2f48718455e0fa52769b92"} Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.221667 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0589080a-1977-4ad4-9660-3db4472b78b4","Type":"ContainerStarted","Data":"830e789093cf90087f6a779526ca752060ac47b351ab87373cadadf0f6a6b31b"} Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.221692 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-2" event={"ID":"0589080a-1977-4ad4-9660-3db4472b78b4","Type":"ContainerStarted","Data":"c5993d58eb63696bc9f75a6e7f793fcfd3fb15b0acf4d039ad6650ea7823e115"} Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.229625 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"80e2b474-5b25-4074-920b-844874ab8fab","Type":"ContainerStarted","Data":"6d3ef560571e1fc68e44ceb8c81d7492c58b7baa94c429d02af48b339b274744"} Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.229686 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"80e2b474-5b25-4074-920b-844874ab8fab","Type":"ContainerStarted","Data":"db338acc26f7e34d9255e8fd92c07272138f7302508d62af35630089767fa70d"} Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.229704 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"80e2b474-5b25-4074-920b-844874ab8fab","Type":"ContainerStarted","Data":"2be1d2781af86df236e339d11a5562e648c8d334bf3cfe3f6720accaf50df2ca"} Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.232131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"acbbc7a4-e670-4315-b4d2-28702c8af2aa","Type":"ContainerStarted","Data":"da1e634a5c1d2c8d46db52a5f34871e21c03bbba9dd24904a608be7661faf46c"} Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.232171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" 
event={"ID":"acbbc7a4-e670-4315-b4d2-28702c8af2aa","Type":"ContainerStarted","Data":"3315ae3b282c0d6344b797a81a306e1801811e060ef222ec8c5d417ec53ff8c9"} Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.242812 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-2" podStartSLOduration=3.242792727 podStartE2EDuration="3.242792727s" podCreationTimestamp="2025-12-03 07:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:44.239552276 +0000 UTC m=+5261.583135794" watchObservedRunningTime="2025-12-03 07:58:44.242792727 +0000 UTC m=+5261.586376235" Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.267195 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=2.267169055 podStartE2EDuration="2.267169055s" podCreationTimestamp="2025-12-03 07:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:44.257824405 +0000 UTC m=+5261.601407903" watchObservedRunningTime="2025-12-03 07:58:44.267169055 +0000 UTC m=+5261.610752563" Dec 03 07:58:44 crc kubenswrapper[4831]: I1203 07:58:44.832311 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Dec 03 07:58:44 crc kubenswrapper[4831]: W1203 07:58:44.849084 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e33c41f_a322_46e0_83ed_309466254c79.slice/crio-b07ad040a06cc54e30c3d8f7b2458e277167ec7c70e6508e89f00c1586e9aff1 WatchSource:0}: Error finding container b07ad040a06cc54e30c3d8f7b2458e277167ec7c70e6508e89f00c1586e9aff1: Status 404 returned error can't find the container with id b07ad040a06cc54e30c3d8f7b2458e277167ec7c70e6508e89f00c1586e9aff1 Dec 03 07:58:44 crc 
kubenswrapper[4831]: I1203 07:58:44.909261 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-1"] Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.244119 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"2131c614-9f45-4b2b-99cf-1b830da4013a","Type":"ContainerStarted","Data":"ac13d4955ada5ff41a0218a88dfecc83e8f6523d9f2f53ea596262dbf89dc251"} Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.244224 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"2131c614-9f45-4b2b-99cf-1b830da4013a","Type":"ContainerStarted","Data":"b3024ee1fea1acb2298c6c356a7264ec975d39e0321d2b7d43eebf9ae6191b9d"} Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.244248 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-1" event={"ID":"2131c614-9f45-4b2b-99cf-1b830da4013a","Type":"ContainerStarted","Data":"90e48aefdd19f4f51421ae5176d38893749f13f5b38992ef8c47a89b907f064c"} Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.245998 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-2" event={"ID":"acbbc7a4-e670-4315-b4d2-28702c8af2aa","Type":"ContainerStarted","Data":"4c83c08d52ceca68be8df23313117b841f212a7db4ea45ce5bfe03d1f2ad74d8"} Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.249087 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7e33c41f-a322-46e0-83ed-309466254c79","Type":"ContainerStarted","Data":"6281241621fc3b63389377e33c859f3372348ca5a7cf992588ee8bab2e24f93e"} Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.249138 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7e33c41f-a322-46e0-83ed-309466254c79","Type":"ContainerStarted","Data":"fc616c3b5f6ad8a1177f8664f6865a2dd152406b4cd84ebf9f49dd750ced5103"} Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 
07:58:45.249162 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"7e33c41f-a322-46e0-83ed-309466254c79","Type":"ContainerStarted","Data":"b07ad040a06cc54e30c3d8f7b2458e277167ec7c70e6508e89f00c1586e9aff1"} Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.271222 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-1" podStartSLOduration=3.2711955010000002 podStartE2EDuration="3.271195501s" podCreationTimestamp="2025-12-03 07:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:45.261347844 +0000 UTC m=+5262.604931352" watchObservedRunningTime="2025-12-03 07:58:45.271195501 +0000 UTC m=+5262.614779039" Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.286206 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-2" podStartSLOduration=3.286185107 podStartE2EDuration="3.286185107s" podCreationTimestamp="2025-12-03 07:58:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:45.277773335 +0000 UTC m=+5262.621356843" watchObservedRunningTime="2025-12-03 07:58:45.286185107 +0000 UTC m=+5262.629768645" Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.299218 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=4.299200523 podStartE2EDuration="4.299200523s" podCreationTimestamp="2025-12-03 07:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:45.292190504 +0000 UTC m=+5262.635774012" watchObservedRunningTime="2025-12-03 07:58:45.299200523 +0000 UTC m=+5262.642784031" Dec 03 07:58:45 crc kubenswrapper[4831]: I1203 07:58:45.646847 
4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-1"] Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.268792 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a1bcc142-49f9-466a-af9b-033b4375a87e","Type":"ContainerStarted","Data":"85fa91a72c4d6cef15034e4527a8c1e73da4eb5aed1428c264447fb72267c749"} Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.268867 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a1bcc142-49f9-466a-af9b-033b4375a87e","Type":"ContainerStarted","Data":"9b011985242807f611a51b888949e84ca89573b96ebfab3d6977a0fd87de5ab9"} Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.268890 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-1" event={"ID":"a1bcc142-49f9-466a-af9b-033b4375a87e","Type":"ContainerStarted","Data":"baee90de8eb0b59e894ff0cb37ffec45955af9acdee759d209f2cf7935a8bc9b"} Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.282868 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.299176 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.305747 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-1" podStartSLOduration=5.305720905 podStartE2EDuration="5.305720905s" podCreationTimestamp="2025-12-03 07:58:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:46.294699822 +0000 UTC m=+5263.638283350" watchObservedRunningTime="2025-12-03 07:58:46.305720905 +0000 UTC m=+5263.649304453" Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.311026 4831 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.430609 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.498736 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.499433 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:46 crc kubenswrapper[4831]: I1203 07:58:46.511914 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:47 crc kubenswrapper[4831]: I1203 07:58:47.280793 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.282249 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.299364 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.310495 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.338861 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.499076 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.512115 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.612445 4831 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6487f7f8fc-xmw9r"] Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.614007 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.616854 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.641362 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6487f7f8fc-xmw9r"] Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.712457 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.712530 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-dns-svc\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.712586 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-config\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.712636 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jb9c6\" (UniqueName: 
\"kubernetes.io/projected/ef666420-cc90-4c72-9f0c-9672ea0037f1-kube-api-access-jb9c6\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.814635 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-dns-svc\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.814904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-config\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.814937 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jb9c6\" (UniqueName: \"kubernetes.io/projected/ef666420-cc90-4c72-9f0c-9672ea0037f1-kube-api-access-jb9c6\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.814998 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.815517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-dns-svc\") pod 
\"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.815622 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-config\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.815721 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-ovsdbserver-sb\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.846545 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jb9c6\" (UniqueName: \"kubernetes.io/projected/ef666420-cc90-4c72-9f0c-9672ea0037f1-kube-api-access-jb9c6\") pod \"dnsmasq-dns-6487f7f8fc-xmw9r\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:48 crc kubenswrapper[4831]: I1203 07:58:48.938903 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.155662 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6487f7f8fc-xmw9r"] Dec 03 07:58:49 crc kubenswrapper[4831]: W1203 07:58:49.160207 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podef666420_cc90_4c72_9f0c_9672ea0037f1.slice/crio-a8faf6ebd9a4983542cb3ca3ae85be2de7cb5e1efbf391b741e17fa56ddfbe3b WatchSource:0}: Error finding container a8faf6ebd9a4983542cb3ca3ae85be2de7cb5e1efbf391b741e17fa56ddfbe3b: Status 404 returned error can't find the container with id a8faf6ebd9a4983542cb3ca3ae85be2de7cb5e1efbf391b741e17fa56ddfbe3b Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.297595 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" event={"ID":"ef666420-cc90-4c72-9f0c-9672ea0037f1","Type":"ContainerStarted","Data":"a8faf6ebd9a4983542cb3ca3ae85be2de7cb5e1efbf391b741e17fa56ddfbe3b"} Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.321577 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.356581 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.361333 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.414199 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-2" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.551544 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 
07:58:49.577721 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.705432 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-2" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.748043 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6487f7f8fc-xmw9r"] Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.769186 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-68f795456f-mmrsd"] Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.770654 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.777778 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.778074 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68f795456f-mmrsd"] Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.933997 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-dns-svc\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.934060 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-config\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.934096 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-nb\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.934432 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-sb\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:49 crc kubenswrapper[4831]: I1203 07:58:49.934483 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfl8g\" (UniqueName: \"kubernetes.io/projected/d7338210-677e-478a-a591-3eacfde2c30f-kube-api-access-lfl8g\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.036114 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-dns-svc\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.036164 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-config\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.036187 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-nb\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.036263 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-sb\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.036284 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfl8g\" (UniqueName: \"kubernetes.io/projected/d7338210-677e-478a-a591-3eacfde2c30f-kube-api-access-lfl8g\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.036984 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-dns-svc\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.037066 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-nb\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.037440 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-config\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.037738 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-sb\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.066626 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfl8g\" (UniqueName: \"kubernetes.io/projected/d7338210-677e-478a-a591-3eacfde2c30f-kube-api-access-lfl8g\") pod \"dnsmasq-dns-68f795456f-mmrsd\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.092866 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.307211 4831 generic.go:334] "Generic (PLEG): container finished" podID="ef666420-cc90-4c72-9f0c-9672ea0037f1" containerID="743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28" exitCode=0 Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.307300 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" event={"ID":"ef666420-cc90-4c72-9f0c-9672ea0037f1","Type":"ContainerDied","Data":"743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28"} Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.365714 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-1" Dec 03 07:58:50 crc kubenswrapper[4831]: I1203 07:58:50.573572 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-68f795456f-mmrsd"] Dec 03 07:58:50 crc kubenswrapper[4831]: W1203 07:58:50.580231 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd7338210_677e_478a_a591_3eacfde2c30f.slice/crio-3d43d225b09725a32f454026af7dba0bbee15b8fb9c099df88f73d16418a67c8 WatchSource:0}: Error finding container 3d43d225b09725a32f454026af7dba0bbee15b8fb9c099df88f73d16418a67c8: Status 404 returned error can't find the container with id 3d43d225b09725a32f454026af7dba0bbee15b8fb9c099df88f73d16418a67c8 Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.316928 4831 generic.go:334] "Generic (PLEG): container finished" podID="d7338210-677e-478a-a591-3eacfde2c30f" containerID="b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73" exitCode=0 Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.316983 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" 
event={"ID":"d7338210-677e-478a-a591-3eacfde2c30f","Type":"ContainerDied","Data":"b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73"} Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.317477 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" event={"ID":"d7338210-677e-478a-a591-3eacfde2c30f","Type":"ContainerStarted","Data":"3d43d225b09725a32f454026af7dba0bbee15b8fb9c099df88f73d16418a67c8"} Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.320287 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" event={"ID":"ef666420-cc90-4c72-9f0c-9672ea0037f1","Type":"ContainerStarted","Data":"e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0"} Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.320601 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" podUID="ef666420-cc90-4c72-9f0c-9672ea0037f1" containerName="dnsmasq-dns" containerID="cri-o://e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0" gracePeriod=10 Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.320692 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.380969 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" podStartSLOduration=3.380944276 podStartE2EDuration="3.380944276s" podCreationTimestamp="2025-12-03 07:58:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:51.375391323 +0000 UTC m=+5268.718974841" watchObservedRunningTime="2025-12-03 07:58:51.380944276 +0000 UTC m=+5268.724527814" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.691348 4831 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.766624 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jb9c6\" (UniqueName: \"kubernetes.io/projected/ef666420-cc90-4c72-9f0c-9672ea0037f1-kube-api-access-jb9c6\") pod \"ef666420-cc90-4c72-9f0c-9672ea0037f1\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.766783 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-dns-svc\") pod \"ef666420-cc90-4c72-9f0c-9672ea0037f1\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.766857 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-ovsdbserver-sb\") pod \"ef666420-cc90-4c72-9f0c-9672ea0037f1\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.766898 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-config\") pod \"ef666420-cc90-4c72-9f0c-9672ea0037f1\" (UID: \"ef666420-cc90-4c72-9f0c-9672ea0037f1\") " Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.773513 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ef666420-cc90-4c72-9f0c-9672ea0037f1-kube-api-access-jb9c6" (OuterVolumeSpecName: "kube-api-access-jb9c6") pod "ef666420-cc90-4c72-9f0c-9672ea0037f1" (UID: "ef666420-cc90-4c72-9f0c-9672ea0037f1"). InnerVolumeSpecName "kube-api-access-jb9c6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.807999 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-config" (OuterVolumeSpecName: "config") pod "ef666420-cc90-4c72-9f0c-9672ea0037f1" (UID: "ef666420-cc90-4c72-9f0c-9672ea0037f1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.818249 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ef666420-cc90-4c72-9f0c-9672ea0037f1" (UID: "ef666420-cc90-4c72-9f0c-9672ea0037f1"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.831768 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ef666420-cc90-4c72-9f0c-9672ea0037f1" (UID: "ef666420-cc90-4c72-9f0c-9672ea0037f1"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.873230 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.873567 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.874211 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jb9c6\" (UniqueName: \"kubernetes.io/projected/ef666420-cc90-4c72-9f0c-9672ea0037f1-kube-api-access-jb9c6\") on node \"crc\" DevicePath \"\"" Dec 03 07:58:51 crc kubenswrapper[4831]: I1203 07:58:51.874286 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ef666420-cc90-4c72-9f0c-9672ea0037f1-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.332779 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" event={"ID":"d7338210-677e-478a-a591-3eacfde2c30f","Type":"ContainerStarted","Data":"f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e"} Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.333499 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.337669 4831 generic.go:334] "Generic (PLEG): container finished" podID="ef666420-cc90-4c72-9f0c-9672ea0037f1" containerID="e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0" exitCode=0 Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.337708 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.337736 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" event={"ID":"ef666420-cc90-4c72-9f0c-9672ea0037f1","Type":"ContainerDied","Data":"e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0"} Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.338282 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6487f7f8fc-xmw9r" event={"ID":"ef666420-cc90-4c72-9f0c-9672ea0037f1","Type":"ContainerDied","Data":"a8faf6ebd9a4983542cb3ca3ae85be2de7cb5e1efbf391b741e17fa56ddfbe3b"} Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.338364 4831 scope.go:117] "RemoveContainer" containerID="e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.372838 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" podStartSLOduration=3.372814063 podStartE2EDuration="3.372814063s" podCreationTimestamp="2025-12-03 07:58:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:52.362909144 +0000 UTC m=+5269.706492732" watchObservedRunningTime="2025-12-03 07:58:52.372814063 +0000 UTC m=+5269.716397581" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.393709 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6487f7f8fc-xmw9r"] Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.394014 4831 scope.go:117] "RemoveContainer" containerID="743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.406973 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6487f7f8fc-xmw9r"] Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 
07:58:52.421573 4831 scope.go:117] "RemoveContainer" containerID="e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0" Dec 03 07:58:52 crc kubenswrapper[4831]: E1203 07:58:52.422219 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0\": container with ID starting with e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0 not found: ID does not exist" containerID="e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.422270 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0"} err="failed to get container status \"e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0\": rpc error: code = NotFound desc = could not find container \"e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0\": container with ID starting with e5b3172335de3d2f810358326945bc89d6071c422061b9f2df9e221d0170f2d0 not found: ID does not exist" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.422302 4831 scope.go:117] "RemoveContainer" containerID="743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28" Dec 03 07:58:52 crc kubenswrapper[4831]: E1203 07:58:52.422748 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28\": container with ID starting with 743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28 not found: ID does not exist" containerID="743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28" Dec 03 07:58:52 crc kubenswrapper[4831]: I1203 07:58:52.422790 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28"} err="failed to get container status \"743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28\": rpc error: code = NotFound desc = could not find container \"743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28\": container with ID starting with 743474bad9fcaae8e0cf7386e00c499d812861467dfd47de1d310d516dd16a28 not found: ID does not exist" Dec 03 07:58:53 crc kubenswrapper[4831]: I1203 07:58:53.034661 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ef666420-cc90-4c72-9f0c-9672ea0037f1" path="/var/lib/kubelet/pods/ef666420-cc90-4c72-9f0c-9672ea0037f1/volumes" Dec 03 07:58:53 crc kubenswrapper[4831]: I1203 07:58:53.339478 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Dec 03 07:58:53 crc kubenswrapper[4831]: I1203 07:58:53.555975 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-1" Dec 03 07:58:55 crc kubenswrapper[4831]: I1203 07:58:55.013711 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:58:55 crc kubenswrapper[4831]: E1203 07:58:55.015726 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.162221 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-copy-data"] Dec 03 07:58:56 crc kubenswrapper[4831]: E1203 07:58:56.164477 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="ef666420-cc90-4c72-9f0c-9672ea0037f1" containerName="dnsmasq-dns" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.164646 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef666420-cc90-4c72-9f0c-9672ea0037f1" containerName="dnsmasq-dns" Dec 03 07:58:56 crc kubenswrapper[4831]: E1203 07:58:56.164793 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ef666420-cc90-4c72-9f0c-9672ea0037f1" containerName="init" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.164905 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ef666420-cc90-4c72-9f0c-9672ea0037f1" containerName="init" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.165524 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ef666420-cc90-4c72-9f0c-9672ea0037f1" containerName="dnsmasq-dns" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.166933 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.170545 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovn-data-cert" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.189269 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.247533 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zm7fp\" (UniqueName: \"kubernetes.io/projected/725c995a-d355-4a06-9824-518cea6948e5-kube-api-access-zm7fp\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.247607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.247779 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/725c995a-d355-4a06-9824-518cea6948e5-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.350369 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zm7fp\" (UniqueName: \"kubernetes.io/projected/725c995a-d355-4a06-9824-518cea6948e5-kube-api-access-zm7fp\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.350458 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.350560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/725c995a-d355-4a06-9824-518cea6948e5-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.354917 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.354997 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/8eb222bc77c7e7d1a0c8d7b963809a5da8d884d6a712970200cf81dfa877177b/globalmount\"" pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.357047 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/725c995a-d355-4a06-9824-518cea6948e5-ovn-data-cert\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.378759 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zm7fp\" (UniqueName: \"kubernetes.io/projected/725c995a-d355-4a06-9824-518cea6948e5-kube-api-access-zm7fp\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.401984 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\") pod \"ovn-copy-data\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " pod="openstack/ovn-copy-data" Dec 03 07:58:56 crc kubenswrapper[4831]: I1203 07:58:56.497763 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 03 07:58:57 crc kubenswrapper[4831]: I1203 07:58:57.096807 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 07:58:57 crc kubenswrapper[4831]: I1203 07:58:57.411298 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"725c995a-d355-4a06-9824-518cea6948e5","Type":"ContainerStarted","Data":"ec5851ca5e1181840c16cabb90be184bd203eb416b53f6c720f8c14986f6e1bc"} Dec 03 07:58:57 crc kubenswrapper[4831]: I1203 07:58:57.411399 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"725c995a-d355-4a06-9824-518cea6948e5","Type":"ContainerStarted","Data":"5df60974a8b13c23c66a903976e3a04a033baa85a6f77e007a314c5d86d39067"} Dec 03 07:58:57 crc kubenswrapper[4831]: I1203 07:58:57.450199 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-copy-data" podStartSLOduration=2.45017051 podStartE2EDuration="2.45017051s" podCreationTimestamp="2025-12-03 07:58:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:58:57.434171263 +0000 UTC m=+5274.777754811" watchObservedRunningTime="2025-12-03 07:58:57.45017051 +0000 UTC m=+5274.793754058" Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.095336 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.161490 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-nbt4j"] Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.161834 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" podUID="849c8e45-ebca-4115-9048-0f54605f6c3c" containerName="dnsmasq-dns" 
containerID="cri-o://2b8ed7fcfff60e237f89454604fe5c9db1d089f4f855a0d3f59765a79ac288d6" gracePeriod=10 Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.443381 4831 generic.go:334] "Generic (PLEG): container finished" podID="849c8e45-ebca-4115-9048-0f54605f6c3c" containerID="2b8ed7fcfff60e237f89454604fe5c9db1d089f4f855a0d3f59765a79ac288d6" exitCode=0 Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.443479 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" event={"ID":"849c8e45-ebca-4115-9048-0f54605f6c3c","Type":"ContainerDied","Data":"2b8ed7fcfff60e237f89454604fe5c9db1d089f4f855a0d3f59765a79ac288d6"} Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.656105 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.827987 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config\") pod \"849c8e45-ebca-4115-9048-0f54605f6c3c\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.828191 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq6hh\" (UniqueName: \"kubernetes.io/projected/849c8e45-ebca-4115-9048-0f54605f6c3c-kube-api-access-sq6hh\") pod \"849c8e45-ebca-4115-9048-0f54605f6c3c\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.828337 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-dns-svc\") pod \"849c8e45-ebca-4115-9048-0f54605f6c3c\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.834994 4831 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/849c8e45-ebca-4115-9048-0f54605f6c3c-kube-api-access-sq6hh" (OuterVolumeSpecName: "kube-api-access-sq6hh") pod "849c8e45-ebca-4115-9048-0f54605f6c3c" (UID: "849c8e45-ebca-4115-9048-0f54605f6c3c"). InnerVolumeSpecName "kube-api-access-sq6hh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:59:00 crc kubenswrapper[4831]: E1203 07:59:00.873390 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config podName:849c8e45-ebca-4115-9048-0f54605f6c3c nodeName:}" failed. No retries permitted until 2025-12-03 07:59:01.373362481 +0000 UTC m=+5278.716945989 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "config" (UniqueName: "kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config") pod "849c8e45-ebca-4115-9048-0f54605f6c3c" (UID: "849c8e45-ebca-4115-9048-0f54605f6c3c") : error deleting /var/lib/kubelet/pods/849c8e45-ebca-4115-9048-0f54605f6c3c/volume-subpaths: remove /var/lib/kubelet/pods/849c8e45-ebca-4115-9048-0f54605f6c3c/volume-subpaths: no such file or directory Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.873743 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "849c8e45-ebca-4115-9048-0f54605f6c3c" (UID: "849c8e45-ebca-4115-9048-0f54605f6c3c"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.930261 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq6hh\" (UniqueName: \"kubernetes.io/projected/849c8e45-ebca-4115-9048-0f54605f6c3c-kube-api-access-sq6hh\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:00 crc kubenswrapper[4831]: I1203 07:59:00.930289 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.440464 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config\") pod \"849c8e45-ebca-4115-9048-0f54605f6c3c\" (UID: \"849c8e45-ebca-4115-9048-0f54605f6c3c\") " Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.441005 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config" (OuterVolumeSpecName: "config") pod "849c8e45-ebca-4115-9048-0f54605f6c3c" (UID: "849c8e45-ebca-4115-9048-0f54605f6c3c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.452956 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" event={"ID":"849c8e45-ebca-4115-9048-0f54605f6c3c","Type":"ContainerDied","Data":"4c45ac01a8ad3a60cf2025c431821fb19aa89e7e5e60d8992812b559cc3e4988"} Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.453006 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b7946d7b9-nbt4j" Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.453019 4831 scope.go:117] "RemoveContainer" containerID="2b8ed7fcfff60e237f89454604fe5c9db1d089f4f855a0d3f59765a79ac288d6" Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.478965 4831 scope.go:117] "RemoveContainer" containerID="e5c43f4aad554eafc4d45ab473e54c844c10f3bb7abab3f9cf65ba112391d224" Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.481766 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-nbt4j"] Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.487638 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b7946d7b9-nbt4j"] Dec 03 07:59:01 crc kubenswrapper[4831]: I1203 07:59:01.542939 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/849c8e45-ebca-4115-9048-0f54605f6c3c-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.804529 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:59:02 crc kubenswrapper[4831]: E1203 07:59:02.805789 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849c8e45-ebca-4115-9048-0f54605f6c3c" containerName="dnsmasq-dns" Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.805821 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="849c8e45-ebca-4115-9048-0f54605f6c3c" containerName="dnsmasq-dns" Dec 03 07:59:02 crc kubenswrapper[4831]: E1203 07:59:02.805859 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="849c8e45-ebca-4115-9048-0f54605f6c3c" containerName="init" Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.805876 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="849c8e45-ebca-4115-9048-0f54605f6c3c" containerName="init" Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.806250 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="849c8e45-ebca-4115-9048-0f54605f6c3c" containerName="dnsmasq-dns" Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.808211 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.811626 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.826924 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-89tvd" Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.836175 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:59:02 crc kubenswrapper[4831]: I1203 07:59:02.836894 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:02.968028 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbr8t\" (UniqueName: \"kubernetes.io/projected/67342401-b261-4a0c-9f2a-f275307dc042-kube-api-access-xbr8t\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:02.968097 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67342401-b261-4a0c-9f2a-f275307dc042-scripts\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:02.968116 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67342401-b261-4a0c-9f2a-f275307dc042-ovn-rundir\") pod \"ovn-northd-0\" 
(UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:02.968150 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67342401-b261-4a0c-9f2a-f275307dc042-config\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:02.968177 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67342401-b261-4a0c-9f2a-f275307dc042-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.038861 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="849c8e45-ebca-4115-9048-0f54605f6c3c" path="/var/lib/kubelet/pods/849c8e45-ebca-4115-9048-0f54605f6c3c/volumes" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.069131 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67342401-b261-4a0c-9f2a-f275307dc042-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.069217 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbr8t\" (UniqueName: \"kubernetes.io/projected/67342401-b261-4a0c-9f2a-f275307dc042-kube-api-access-xbr8t\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.069273 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/67342401-b261-4a0c-9f2a-f275307dc042-scripts\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.069301 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67342401-b261-4a0c-9f2a-f275307dc042-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.069354 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67342401-b261-4a0c-9f2a-f275307dc042-config\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.069792 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/67342401-b261-4a0c-9f2a-f275307dc042-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.071129 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.071543 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.078028 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/67342401-b261-4a0c-9f2a-f275307dc042-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.081288 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/67342401-b261-4a0c-9f2a-f275307dc042-scripts\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.081519 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/67342401-b261-4a0c-9f2a-f275307dc042-config\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.089952 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbr8t\" (UniqueName: \"kubernetes.io/projected/67342401-b261-4a0c-9f2a-f275307dc042-kube-api-access-xbr8t\") pod \"ovn-northd-0\" (UID: \"67342401-b261-4a0c-9f2a-f275307dc042\") " pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.139838 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-89tvd" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.148729 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Dec 03 07:59:03 crc kubenswrapper[4831]: I1203 07:59:03.622442 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Dec 03 07:59:04 crc kubenswrapper[4831]: I1203 07:59:04.501807 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"67342401-b261-4a0c-9f2a-f275307dc042","Type":"ContainerStarted","Data":"b3590e8a838c3e1510d570b9f591d3880001b139ea37e01b2a3fc22029b6a6b6"} Dec 03 07:59:04 crc kubenswrapper[4831]: I1203 07:59:04.502441 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Dec 03 07:59:04 crc kubenswrapper[4831]: I1203 07:59:04.502481 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"67342401-b261-4a0c-9f2a-f275307dc042","Type":"ContainerStarted","Data":"bce4f84f8a5f55732be8a79fbf42bb130f479d55e6acdd2b873e01e1235da716"} Dec 03 07:59:04 crc kubenswrapper[4831]: I1203 07:59:04.502496 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"67342401-b261-4a0c-9f2a-f275307dc042","Type":"ContainerStarted","Data":"be853fe96586c83215baf3f7a83700cd76eca6104024c2ac90f995036cf816c5"} Dec 03 07:59:04 crc kubenswrapper[4831]: I1203 07:59:04.521915 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.521893092 podStartE2EDuration="2.521893092s" podCreationTimestamp="2025-12-03 07:59:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:59:04.521541091 +0000 UTC m=+5281.865124599" watchObservedRunningTime="2025-12-03 07:59:04.521893092 +0000 UTC m=+5281.865476600" Dec 03 07:59:06 crc kubenswrapper[4831]: I1203 07:59:06.012430 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 
07:59:06 crc kubenswrapper[4831]: E1203 07:59:06.012965 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.302682 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-pxdqm"] Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.308092 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.315075 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pxdqm"] Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.394276 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c263-account-create-update-kj8xl"] Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.395242 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.397670 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.405716 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c263-account-create-update-kj8xl"] Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.459957 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1dacad-6332-45b2-94be-7b25a1e3c463-operator-scripts\") pod \"keystone-db-create-pxdqm\" (UID: \"3c1dacad-6332-45b2-94be-7b25a1e3c463\") " pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.460023 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bk46q\" (UniqueName: \"kubernetes.io/projected/3c1dacad-6332-45b2-94be-7b25a1e3c463-kube-api-access-bk46q\") pod \"keystone-db-create-pxdqm\" (UID: \"3c1dacad-6332-45b2-94be-7b25a1e3c463\") " pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.561305 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436bc1e-96d4-47b2-9724-214eef860853-operator-scripts\") pod \"keystone-c263-account-create-update-kj8xl\" (UID: \"6436bc1e-96d4-47b2-9724-214eef860853\") " pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.561503 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1dacad-6332-45b2-94be-7b25a1e3c463-operator-scripts\") pod \"keystone-db-create-pxdqm\" (UID: 
\"3c1dacad-6332-45b2-94be-7b25a1e3c463\") " pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.561591 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hw644\" (UniqueName: \"kubernetes.io/projected/6436bc1e-96d4-47b2-9724-214eef860853-kube-api-access-hw644\") pod \"keystone-c263-account-create-update-kj8xl\" (UID: \"6436bc1e-96d4-47b2-9724-214eef860853\") " pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.561642 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bk46q\" (UniqueName: \"kubernetes.io/projected/3c1dacad-6332-45b2-94be-7b25a1e3c463-kube-api-access-bk46q\") pod \"keystone-db-create-pxdqm\" (UID: \"3c1dacad-6332-45b2-94be-7b25a1e3c463\") " pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.564819 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1dacad-6332-45b2-94be-7b25a1e3c463-operator-scripts\") pod \"keystone-db-create-pxdqm\" (UID: \"3c1dacad-6332-45b2-94be-7b25a1e3c463\") " pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.590218 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bk46q\" (UniqueName: \"kubernetes.io/projected/3c1dacad-6332-45b2-94be-7b25a1e3c463-kube-api-access-bk46q\") pod \"keystone-db-create-pxdqm\" (UID: \"3c1dacad-6332-45b2-94be-7b25a1e3c463\") " pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.628535 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.663286 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436bc1e-96d4-47b2-9724-214eef860853-operator-scripts\") pod \"keystone-c263-account-create-update-kj8xl\" (UID: \"6436bc1e-96d4-47b2-9724-214eef860853\") " pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.663399 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hw644\" (UniqueName: \"kubernetes.io/projected/6436bc1e-96d4-47b2-9724-214eef860853-kube-api-access-hw644\") pod \"keystone-c263-account-create-update-kj8xl\" (UID: \"6436bc1e-96d4-47b2-9724-214eef860853\") " pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.664025 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436bc1e-96d4-47b2-9724-214eef860853-operator-scripts\") pod \"keystone-c263-account-create-update-kj8xl\" (UID: \"6436bc1e-96d4-47b2-9724-214eef860853\") " pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.684791 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hw644\" (UniqueName: \"kubernetes.io/projected/6436bc1e-96d4-47b2-9724-214eef860853-kube-api-access-hw644\") pod \"keystone-c263-account-create-update-kj8xl\" (UID: \"6436bc1e-96d4-47b2-9724-214eef860853\") " pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:08 crc kubenswrapper[4831]: I1203 07:59:08.712138 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:09 crc kubenswrapper[4831]: I1203 07:59:09.080565 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-pxdqm"] Dec 03 07:59:09 crc kubenswrapper[4831]: W1203 07:59:09.086623 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c1dacad_6332_45b2_94be_7b25a1e3c463.slice/crio-1c26a1434bb6806dc09052a8cc14e467a3ad7a218fd1aa42c0e58b951d11fca9 WatchSource:0}: Error finding container 1c26a1434bb6806dc09052a8cc14e467a3ad7a218fd1aa42c0e58b951d11fca9: Status 404 returned error can't find the container with id 1c26a1434bb6806dc09052a8cc14e467a3ad7a218fd1aa42c0e58b951d11fca9 Dec 03 07:59:09 crc kubenswrapper[4831]: I1203 07:59:09.238595 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c263-account-create-update-kj8xl"] Dec 03 07:59:09 crc kubenswrapper[4831]: I1203 07:59:09.553912 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c263-account-create-update-kj8xl" event={"ID":"6436bc1e-96d4-47b2-9724-214eef860853","Type":"ContainerStarted","Data":"3feb284151e5983b4315f80a434ce03e91c574931dc5027b83e41550404414f1"} Dec 03 07:59:09 crc kubenswrapper[4831]: I1203 07:59:09.554264 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c263-account-create-update-kj8xl" event={"ID":"6436bc1e-96d4-47b2-9724-214eef860853","Type":"ContainerStarted","Data":"77506c491727602b3b09c374aa9a2ba1c58b4af9fd89b25afd9e20db062e1408"} Dec 03 07:59:09 crc kubenswrapper[4831]: I1203 07:59:09.556078 4831 generic.go:334] "Generic (PLEG): container finished" podID="3c1dacad-6332-45b2-94be-7b25a1e3c463" containerID="b89ce73699b3850b1974a1ba8c4558871b6a7ee61752e50e902b696603da1a08" exitCode=0 Dec 03 07:59:09 crc kubenswrapper[4831]: I1203 07:59:09.556161 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-pxdqm" event={"ID":"3c1dacad-6332-45b2-94be-7b25a1e3c463","Type":"ContainerDied","Data":"b89ce73699b3850b1974a1ba8c4558871b6a7ee61752e50e902b696603da1a08"} Dec 03 07:59:09 crc kubenswrapper[4831]: I1203 07:59:09.556209 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pxdqm" event={"ID":"3c1dacad-6332-45b2-94be-7b25a1e3c463","Type":"ContainerStarted","Data":"1c26a1434bb6806dc09052a8cc14e467a3ad7a218fd1aa42c0e58b951d11fca9"} Dec 03 07:59:09 crc kubenswrapper[4831]: I1203 07:59:09.578170 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-c263-account-create-update-kj8xl" podStartSLOduration=1.578143003 podStartE2EDuration="1.578143003s" podCreationTimestamp="2025-12-03 07:59:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:59:09.571749595 +0000 UTC m=+5286.915333163" watchObservedRunningTime="2025-12-03 07:59:09.578143003 +0000 UTC m=+5286.921726531" Dec 03 07:59:10 crc kubenswrapper[4831]: I1203 07:59:10.575391 4831 generic.go:334] "Generic (PLEG): container finished" podID="6436bc1e-96d4-47b2-9724-214eef860853" containerID="3feb284151e5983b4315f80a434ce03e91c574931dc5027b83e41550404414f1" exitCode=0 Dec 03 07:59:10 crc kubenswrapper[4831]: I1203 07:59:10.575664 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c263-account-create-update-kj8xl" event={"ID":"6436bc1e-96d4-47b2-9724-214eef860853","Type":"ContainerDied","Data":"3feb284151e5983b4315f80a434ce03e91c574931dc5027b83e41550404414f1"} Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.042864 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.210980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bk46q\" (UniqueName: \"kubernetes.io/projected/3c1dacad-6332-45b2-94be-7b25a1e3c463-kube-api-access-bk46q\") pod \"3c1dacad-6332-45b2-94be-7b25a1e3c463\" (UID: \"3c1dacad-6332-45b2-94be-7b25a1e3c463\") " Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.211171 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1dacad-6332-45b2-94be-7b25a1e3c463-operator-scripts\") pod \"3c1dacad-6332-45b2-94be-7b25a1e3c463\" (UID: \"3c1dacad-6332-45b2-94be-7b25a1e3c463\") " Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.211992 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3c1dacad-6332-45b2-94be-7b25a1e3c463-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3c1dacad-6332-45b2-94be-7b25a1e3c463" (UID: "3c1dacad-6332-45b2-94be-7b25a1e3c463"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.219774 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c1dacad-6332-45b2-94be-7b25a1e3c463-kube-api-access-bk46q" (OuterVolumeSpecName: "kube-api-access-bk46q") pod "3c1dacad-6332-45b2-94be-7b25a1e3c463" (UID: "3c1dacad-6332-45b2-94be-7b25a1e3c463"). InnerVolumeSpecName "kube-api-access-bk46q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.313587 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3c1dacad-6332-45b2-94be-7b25a1e3c463-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.313643 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bk46q\" (UniqueName: \"kubernetes.io/projected/3c1dacad-6332-45b2-94be-7b25a1e3c463-kube-api-access-bk46q\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.588248 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-pxdqm" event={"ID":"3c1dacad-6332-45b2-94be-7b25a1e3c463","Type":"ContainerDied","Data":"1c26a1434bb6806dc09052a8cc14e467a3ad7a218fd1aa42c0e58b951d11fca9"} Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.588615 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-pxdqm" Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.588619 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1c26a1434bb6806dc09052a8cc14e467a3ad7a218fd1aa42c0e58b951d11fca9" Dec 03 07:59:11 crc kubenswrapper[4831]: I1203 07:59:11.977360 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.125150 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436bc1e-96d4-47b2-9724-214eef860853-operator-scripts\") pod \"6436bc1e-96d4-47b2-9724-214eef860853\" (UID: \"6436bc1e-96d4-47b2-9724-214eef860853\") " Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.125280 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hw644\" (UniqueName: \"kubernetes.io/projected/6436bc1e-96d4-47b2-9724-214eef860853-kube-api-access-hw644\") pod \"6436bc1e-96d4-47b2-9724-214eef860853\" (UID: \"6436bc1e-96d4-47b2-9724-214eef860853\") " Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.125719 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6436bc1e-96d4-47b2-9724-214eef860853-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6436bc1e-96d4-47b2-9724-214eef860853" (UID: "6436bc1e-96d4-47b2-9724-214eef860853"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.126089 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6436bc1e-96d4-47b2-9724-214eef860853-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.130618 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6436bc1e-96d4-47b2-9724-214eef860853-kube-api-access-hw644" (OuterVolumeSpecName: "kube-api-access-hw644") pod "6436bc1e-96d4-47b2-9724-214eef860853" (UID: "6436bc1e-96d4-47b2-9724-214eef860853"). InnerVolumeSpecName "kube-api-access-hw644". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.227793 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hw644\" (UniqueName: \"kubernetes.io/projected/6436bc1e-96d4-47b2-9724-214eef860853-kube-api-access-hw644\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.603552 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c263-account-create-update-kj8xl" event={"ID":"6436bc1e-96d4-47b2-9724-214eef860853","Type":"ContainerDied","Data":"77506c491727602b3b09c374aa9a2ba1c58b4af9fd89b25afd9e20db062e1408"} Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.603629 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="77506c491727602b3b09c374aa9a2ba1c58b4af9fd89b25afd9e20db062e1408" Dec 03 07:59:12 crc kubenswrapper[4831]: I1203 07:59:12.603727 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c263-account-create-update-kj8xl" Dec 03 07:59:13 crc kubenswrapper[4831]: I1203 07:59:13.225164 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.043794 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-2lq75"] Dec 03 07:59:14 crc kubenswrapper[4831]: E1203 07:59:14.044576 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6436bc1e-96d4-47b2-9724-214eef860853" containerName="mariadb-account-create-update" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.044593 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6436bc1e-96d4-47b2-9724-214eef860853" containerName="mariadb-account-create-update" Dec 03 07:59:14 crc kubenswrapper[4831]: E1203 07:59:14.044611 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="3c1dacad-6332-45b2-94be-7b25a1e3c463" containerName="mariadb-database-create" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.044622 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c1dacad-6332-45b2-94be-7b25a1e3c463" containerName="mariadb-database-create" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.044835 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c1dacad-6332-45b2-94be-7b25a1e3c463" containerName="mariadb-database-create" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.044862 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6436bc1e-96d4-47b2-9724-214eef860853" containerName="mariadb-account-create-update" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.045602 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.047225 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.053544 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2lq75"] Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.054747 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.054855 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.057658 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rhrp2" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.159518 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h86jm\" (UniqueName: \"kubernetes.io/projected/05cc059b-0638-4e4c-8410-ace0ba4f391a-kube-api-access-h86jm\") 
pod \"keystone-db-sync-2lq75\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.159594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-combined-ca-bundle\") pod \"keystone-db-sync-2lq75\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.159663 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-config-data\") pod \"keystone-db-sync-2lq75\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.260730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-config-data\") pod \"keystone-db-sync-2lq75\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.260895 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h86jm\" (UniqueName: \"kubernetes.io/projected/05cc059b-0638-4e4c-8410-ace0ba4f391a-kube-api-access-h86jm\") pod \"keystone-db-sync-2lq75\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.260934 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-combined-ca-bundle\") pod \"keystone-db-sync-2lq75\" (UID: 
\"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.266080 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-combined-ca-bundle\") pod \"keystone-db-sync-2lq75\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.275060 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-config-data\") pod \"keystone-db-sync-2lq75\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.280715 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h86jm\" (UniqueName: \"kubernetes.io/projected/05cc059b-0638-4e4c-8410-ace0ba4f391a-kube-api-access-h86jm\") pod \"keystone-db-sync-2lq75\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.368251 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:14 crc kubenswrapper[4831]: I1203 07:59:14.915469 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-2lq75"] Dec 03 07:59:14 crc kubenswrapper[4831]: W1203 07:59:14.917468 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05cc059b_0638_4e4c_8410_ace0ba4f391a.slice/crio-610923abc1e7e96fd8b7160fc056e8d117558f4aa24790eecfabb7a47b219be8 WatchSource:0}: Error finding container 610923abc1e7e96fd8b7160fc056e8d117558f4aa24790eecfabb7a47b219be8: Status 404 returned error can't find the container with id 610923abc1e7e96fd8b7160fc056e8d117558f4aa24790eecfabb7a47b219be8 Dec 03 07:59:15 crc kubenswrapper[4831]: I1203 07:59:15.636899 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2lq75" event={"ID":"05cc059b-0638-4e4c-8410-ace0ba4f391a","Type":"ContainerStarted","Data":"c04360e9abf6313203910473b490fd4400547523df0b433bc6b72f055e338698"} Dec 03 07:59:15 crc kubenswrapper[4831]: I1203 07:59:15.637309 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2lq75" event={"ID":"05cc059b-0638-4e4c-8410-ace0ba4f391a","Type":"ContainerStarted","Data":"610923abc1e7e96fd8b7160fc056e8d117558f4aa24790eecfabb7a47b219be8"} Dec 03 07:59:15 crc kubenswrapper[4831]: I1203 07:59:15.674942 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-2lq75" podStartSLOduration=1.674909554 podStartE2EDuration="1.674909554s" podCreationTimestamp="2025-12-03 07:59:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:59:15.665601135 +0000 UTC m=+5293.009184683" watchObservedRunningTime="2025-12-03 07:59:15.674909554 +0000 UTC m=+5293.018493102" Dec 03 07:59:17 crc kubenswrapper[4831]: I1203 
07:59:17.659788 4831 generic.go:334] "Generic (PLEG): container finished" podID="05cc059b-0638-4e4c-8410-ace0ba4f391a" containerID="c04360e9abf6313203910473b490fd4400547523df0b433bc6b72f055e338698" exitCode=0 Dec 03 07:59:17 crc kubenswrapper[4831]: I1203 07:59:17.659938 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2lq75" event={"ID":"05cc059b-0638-4e4c-8410-ace0ba4f391a","Type":"ContainerDied","Data":"c04360e9abf6313203910473b490fd4400547523df0b433bc6b72f055e338698"} Dec 03 07:59:18 crc kubenswrapper[4831]: I1203 07:59:18.013503 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:59:18 crc kubenswrapper[4831]: E1203 07:59:18.014301 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 07:59:18 crc kubenswrapper[4831]: I1203 07:59:18.814497 4831 scope.go:117] "RemoveContainer" containerID="a78523f273d45c4a96f1498fe8b9235a8197c9a6cc18841acd55355f57009a0a" Dec 03 07:59:18 crc kubenswrapper[4831]: I1203 07:59:18.853195 4831 scope.go:117] "RemoveContainer" containerID="1119170d5a73d59578adef9531478efbb8273e8d56d789e589b57fec50ed3b61" Dec 03 07:59:18 crc kubenswrapper[4831]: I1203 07:59:18.938822 4831 scope.go:117] "RemoveContainer" containerID="1827fb7b07c920c84180ec7aeb648c8c903e1f314c151a1df9583c3b321718d7" Dec 03 07:59:18 crc kubenswrapper[4831]: I1203 07:59:18.957691 4831 scope.go:117] "RemoveContainer" containerID="dc7b50e85ee5b590677e46c724a121832ef46a8fb7d2589f1d2cbef19dc4a779" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.001512 4831 scope.go:117] 
"RemoveContainer" containerID="ee7ebf71b9bfbd5e2036592443931e9a199849cce989ced67546287f62619c78" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.077558 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.253787 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-combined-ca-bundle\") pod \"05cc059b-0638-4e4c-8410-ace0ba4f391a\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.253864 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-config-data\") pod \"05cc059b-0638-4e4c-8410-ace0ba4f391a\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.254114 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h86jm\" (UniqueName: \"kubernetes.io/projected/05cc059b-0638-4e4c-8410-ace0ba4f391a-kube-api-access-h86jm\") pod \"05cc059b-0638-4e4c-8410-ace0ba4f391a\" (UID: \"05cc059b-0638-4e4c-8410-ace0ba4f391a\") " Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.261785 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05cc059b-0638-4e4c-8410-ace0ba4f391a-kube-api-access-h86jm" (OuterVolumeSpecName: "kube-api-access-h86jm") pod "05cc059b-0638-4e4c-8410-ace0ba4f391a" (UID: "05cc059b-0638-4e4c-8410-ace0ba4f391a"). InnerVolumeSpecName "kube-api-access-h86jm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.298441 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05cc059b-0638-4e4c-8410-ace0ba4f391a" (UID: "05cc059b-0638-4e4c-8410-ace0ba4f391a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.316085 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-config-data" (OuterVolumeSpecName: "config-data") pod "05cc059b-0638-4e4c-8410-ace0ba4f391a" (UID: "05cc059b-0638-4e4c-8410-ace0ba4f391a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.356054 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h86jm\" (UniqueName: \"kubernetes.io/projected/05cc059b-0638-4e4c-8410-ace0ba4f391a-kube-api-access-h86jm\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.356119 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.356139 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05cc059b-0638-4e4c-8410-ace0ba4f391a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.677217 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-2lq75" 
event={"ID":"05cc059b-0638-4e4c-8410-ace0ba4f391a","Type":"ContainerDied","Data":"610923abc1e7e96fd8b7160fc056e8d117558f4aa24790eecfabb7a47b219be8"} Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.677253 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="610923abc1e7e96fd8b7160fc056e8d117558f4aa24790eecfabb7a47b219be8" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.677267 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-2lq75" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.935095 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-778c4f6d7f-bvnv9"] Dec 03 07:59:19 crc kubenswrapper[4831]: E1203 07:59:19.935509 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05cc059b-0638-4e4c-8410-ace0ba4f391a" containerName="keystone-db-sync" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.935525 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="05cc059b-0638-4e4c-8410-ace0ba4f391a" containerName="keystone-db-sync" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.935727 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="05cc059b-0638-4e4c-8410-ace0ba4f391a" containerName="keystone-db-sync" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.936770 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.949266 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778c4f6d7f-bvnv9"] Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.966458 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-b2nx7"] Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.968423 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.972769 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.973001 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.973149 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.973429 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rhrp2" Dec 03 07:59:19 crc kubenswrapper[4831]: I1203 07:59:19.973631 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.006069 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b2nx7"] Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072159 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-scripts\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072571 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nsl\" (UniqueName: \"kubernetes.io/projected/4c7fb030-01a4-4122-833f-c306bea2f68a-kube-api-access-48nsl\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072607 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-sb\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072636 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-fernet-keys\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072681 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-config\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072738 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-combined-ca-bundle\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072772 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-dns-svc\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072793 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-hq656\" (UniqueName: \"kubernetes.io/projected/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-kube-api-access-hq656\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.072904 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-config-data\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.073035 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-nb\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.073202 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-credential-keys\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175265 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-config-data\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175329 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-nb\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175397 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-credential-keys\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175439 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-scripts\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175475 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48nsl\" (UniqueName: \"kubernetes.io/projected/4c7fb030-01a4-4122-833f-c306bea2f68a-kube-api-access-48nsl\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175508 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-sb\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175534 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-fernet-keys\") 
pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175575 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-config\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175610 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-combined-ca-bundle\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175641 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-dns-svc\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.175662 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq656\" (UniqueName: \"kubernetes.io/projected/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-kube-api-access-hq656\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.176391 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-nb\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " 
pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.176913 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-config\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.177043 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-dns-svc\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.177258 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-sb\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.179819 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-credential-keys\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.180072 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-fernet-keys\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.180087 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-scripts\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.180111 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-combined-ca-bundle\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.189583 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-config-data\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.195012 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq656\" (UniqueName: \"kubernetes.io/projected/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-kube-api-access-hq656\") pod \"keystone-bootstrap-b2nx7\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.196461 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nsl\" (UniqueName: \"kubernetes.io/projected/4c7fb030-01a4-4122-833f-c306bea2f68a-kube-api-access-48nsl\") pod \"dnsmasq-dns-778c4f6d7f-bvnv9\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.252277 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.293843 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.699062 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-778c4f6d7f-bvnv9"] Dec 03 07:59:20 crc kubenswrapper[4831]: W1203 07:59:20.703822 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c7fb030_01a4_4122_833f_c306bea2f68a.slice/crio-c9a3fa8387a41eb574c663ca1963957cb446713475967ba06f051715e8104931 WatchSource:0}: Error finding container c9a3fa8387a41eb574c663ca1963957cb446713475967ba06f051715e8104931: Status 404 returned error can't find the container with id c9a3fa8387a41eb574c663ca1963957cb446713475967ba06f051715e8104931 Dec 03 07:59:20 crc kubenswrapper[4831]: I1203 07:59:20.784466 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-b2nx7"] Dec 03 07:59:20 crc kubenswrapper[4831]: W1203 07:59:20.798379 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0132f630_9e7f_4d4a_ad97_b8068fe9db6b.slice/crio-f6d444866e0c717ad98611813d9b2e08f4be193b492599dcd210e272811e70b4 WatchSource:0}: Error finding container f6d444866e0c717ad98611813d9b2e08f4be193b492599dcd210e272811e70b4: Status 404 returned error can't find the container with id f6d444866e0c717ad98611813d9b2e08f4be193b492599dcd210e272811e70b4 Dec 03 07:59:21 crc kubenswrapper[4831]: I1203 07:59:21.696693 4831 generic.go:334] "Generic (PLEG): container finished" podID="4c7fb030-01a4-4122-833f-c306bea2f68a" containerID="55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55" exitCode=0 Dec 03 07:59:21 crc kubenswrapper[4831]: I1203 07:59:21.696759 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" event={"ID":"4c7fb030-01a4-4122-833f-c306bea2f68a","Type":"ContainerDied","Data":"55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55"} Dec 03 07:59:21 crc kubenswrapper[4831]: I1203 07:59:21.697106 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" event={"ID":"4c7fb030-01a4-4122-833f-c306bea2f68a","Type":"ContainerStarted","Data":"c9a3fa8387a41eb574c663ca1963957cb446713475967ba06f051715e8104931"} Dec 03 07:59:21 crc kubenswrapper[4831]: I1203 07:59:21.702652 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b2nx7" event={"ID":"0132f630-9e7f-4d4a-ad97-b8068fe9db6b","Type":"ContainerStarted","Data":"9ccc325c6c1a21c7688c9a00f678b1bda9089ee92ad8bdba2bd474ba475ef818"} Dec 03 07:59:21 crc kubenswrapper[4831]: I1203 07:59:21.702703 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b2nx7" event={"ID":"0132f630-9e7f-4d4a-ad97-b8068fe9db6b","Type":"ContainerStarted","Data":"f6d444866e0c717ad98611813d9b2e08f4be193b492599dcd210e272811e70b4"} Dec 03 07:59:21 crc kubenswrapper[4831]: I1203 07:59:21.756223 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-b2nx7" podStartSLOduration=2.756200994 podStartE2EDuration="2.756200994s" podCreationTimestamp="2025-12-03 07:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:59:21.750701263 +0000 UTC m=+5299.094284821" watchObservedRunningTime="2025-12-03 07:59:21.756200994 +0000 UTC m=+5299.099784502" Dec 03 07:59:22 crc kubenswrapper[4831]: I1203 07:59:22.724984 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" 
event={"ID":"4c7fb030-01a4-4122-833f-c306bea2f68a","Type":"ContainerStarted","Data":"c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b"} Dec 03 07:59:22 crc kubenswrapper[4831]: I1203 07:59:22.726156 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:22 crc kubenswrapper[4831]: I1203 07:59:22.772093 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" podStartSLOduration=3.772066937 podStartE2EDuration="3.772066937s" podCreationTimestamp="2025-12-03 07:59:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:59:22.756528164 +0000 UTC m=+5300.100111662" watchObservedRunningTime="2025-12-03 07:59:22.772066937 +0000 UTC m=+5300.115650475" Dec 03 07:59:24 crc kubenswrapper[4831]: I1203 07:59:24.753861 4831 generic.go:334] "Generic (PLEG): container finished" podID="0132f630-9e7f-4d4a-ad97-b8068fe9db6b" containerID="9ccc325c6c1a21c7688c9a00f678b1bda9089ee92ad8bdba2bd474ba475ef818" exitCode=0 Dec 03 07:59:24 crc kubenswrapper[4831]: I1203 07:59:24.753924 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b2nx7" event={"ID":"0132f630-9e7f-4d4a-ad97-b8068fe9db6b","Type":"ContainerDied","Data":"9ccc325c6c1a21c7688c9a00f678b1bda9089ee92ad8bdba2bd474ba475ef818"} Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.132421 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.282311 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-scripts\") pod \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.282437 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hq656\" (UniqueName: \"kubernetes.io/projected/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-kube-api-access-hq656\") pod \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.282524 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-config-data\") pod \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.282607 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-combined-ca-bundle\") pod \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.282634 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-credential-keys\") pod \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.282686 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" 
(UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-fernet-keys\") pod \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\" (UID: \"0132f630-9e7f-4d4a-ad97-b8068fe9db6b\") " Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.289107 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-kube-api-access-hq656" (OuterVolumeSpecName: "kube-api-access-hq656") pod "0132f630-9e7f-4d4a-ad97-b8068fe9db6b" (UID: "0132f630-9e7f-4d4a-ad97-b8068fe9db6b"). InnerVolumeSpecName "kube-api-access-hq656". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.290408 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "0132f630-9e7f-4d4a-ad97-b8068fe9db6b" (UID: "0132f630-9e7f-4d4a-ad97-b8068fe9db6b"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.290493 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "0132f630-9e7f-4d4a-ad97-b8068fe9db6b" (UID: "0132f630-9e7f-4d4a-ad97-b8068fe9db6b"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.297506 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-scripts" (OuterVolumeSpecName: "scripts") pod "0132f630-9e7f-4d4a-ad97-b8068fe9db6b" (UID: "0132f630-9e7f-4d4a-ad97-b8068fe9db6b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.325076 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0132f630-9e7f-4d4a-ad97-b8068fe9db6b" (UID: "0132f630-9e7f-4d4a-ad97-b8068fe9db6b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.330283 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-config-data" (OuterVolumeSpecName: "config-data") pod "0132f630-9e7f-4d4a-ad97-b8068fe9db6b" (UID: "0132f630-9e7f-4d4a-ad97-b8068fe9db6b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.384491 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.384550 4831 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.384566 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.384579 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:26 crc 
kubenswrapper[4831]: I1203 07:59:26.384594 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hq656\" (UniqueName: \"kubernetes.io/projected/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-kube-api-access-hq656\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.384607 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0132f630-9e7f-4d4a-ad97-b8068fe9db6b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.781520 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-b2nx7" event={"ID":"0132f630-9e7f-4d4a-ad97-b8068fe9db6b","Type":"ContainerDied","Data":"f6d444866e0c717ad98611813d9b2e08f4be193b492599dcd210e272811e70b4"} Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.781566 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6d444866e0c717ad98611813d9b2e08f4be193b492599dcd210e272811e70b4" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.781608 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-b2nx7" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.873073 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-b2nx7"] Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.880644 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-b2nx7"] Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.966494 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-4cvbb"] Dec 03 07:59:26 crc kubenswrapper[4831]: E1203 07:59:26.967308 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0132f630-9e7f-4d4a-ad97-b8068fe9db6b" containerName="keystone-bootstrap" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.967357 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0132f630-9e7f-4d4a-ad97-b8068fe9db6b" containerName="keystone-bootstrap" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.967605 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0132f630-9e7f-4d4a-ad97-b8068fe9db6b" containerName="keystone-bootstrap" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.968776 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:26 crc kubenswrapper[4831]: I1203 07:59:26.978927 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4cvbb"] Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.007839 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.007979 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.008121 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rhrp2" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.008245 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.008302 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.009357 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-combined-ca-bundle\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.009415 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlks7\" (UniqueName: \"kubernetes.io/projected/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-kube-api-access-zlks7\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.009443 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-fernet-keys\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.009487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-config-data\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.009578 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-credential-keys\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.009694 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-scripts\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.041562 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0132f630-9e7f-4d4a-ad97-b8068fe9db6b" path="/var/lib/kubelet/pods/0132f630-9e7f-4d4a-ad97-b8068fe9db6b/volumes" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.110897 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-combined-ca-bundle\") pod 
\"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.111834 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlks7\" (UniqueName: \"kubernetes.io/projected/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-kube-api-access-zlks7\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.111914 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-fernet-keys\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.112447 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-config-data\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.112575 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-credential-keys\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.112757 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-scripts\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " 
pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.123051 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-fernet-keys\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.123073 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-config-data\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.123052 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-credential-keys\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.123439 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-combined-ca-bundle\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.125503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-scripts\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.140187 4831 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access-zlks7\" (UniqueName: \"kubernetes.io/projected/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-kube-api-access-zlks7\") pod \"keystone-bootstrap-4cvbb\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.326643 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:27 crc kubenswrapper[4831]: I1203 07:59:27.794477 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-4cvbb"] Dec 03 07:59:28 crc kubenswrapper[4831]: I1203 07:59:28.809381 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4cvbb" event={"ID":"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e","Type":"ContainerStarted","Data":"4118cdbf98f0a5b447c2361fdcae316c630d4c8a1e091b507663859a4ee3db73"} Dec 03 07:59:28 crc kubenswrapper[4831]: I1203 07:59:28.812049 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4cvbb" event={"ID":"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e","Type":"ContainerStarted","Data":"4bd5cf2381af48ab3572389a7f9f7c1eb880b86fd3787dc34ca3e857408a5bc1"} Dec 03 07:59:28 crc kubenswrapper[4831]: I1203 07:59:28.844935 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-4cvbb" podStartSLOduration=2.844904264 podStartE2EDuration="2.844904264s" podCreationTimestamp="2025-12-03 07:59:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:59:28.832712696 +0000 UTC m=+5306.176296204" watchObservedRunningTime="2025-12-03 07:59:28.844904264 +0000 UTC m=+5306.188487802" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.254625 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 07:59:30 crc 
kubenswrapper[4831]: I1203 07:59:30.362842 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68f795456f-mmrsd"] Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.363502 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" podUID="d7338210-677e-478a-a591-3eacfde2c30f" containerName="dnsmasq-dns" containerID="cri-o://f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e" gracePeriod=10 Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.784056 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.808473 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-nb\") pod \"d7338210-677e-478a-a591-3eacfde2c30f\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.808551 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-sb\") pod \"d7338210-677e-478a-a591-3eacfde2c30f\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.808619 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-dns-svc\") pod \"d7338210-677e-478a-a591-3eacfde2c30f\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.808651 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-config\") pod 
\"d7338210-677e-478a-a591-3eacfde2c30f\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.808671 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfl8g\" (UniqueName: \"kubernetes.io/projected/d7338210-677e-478a-a591-3eacfde2c30f-kube-api-access-lfl8g\") pod \"d7338210-677e-478a-a591-3eacfde2c30f\" (UID: \"d7338210-677e-478a-a591-3eacfde2c30f\") " Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.814836 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7338210-677e-478a-a591-3eacfde2c30f-kube-api-access-lfl8g" (OuterVolumeSpecName: "kube-api-access-lfl8g") pod "d7338210-677e-478a-a591-3eacfde2c30f" (UID: "d7338210-677e-478a-a591-3eacfde2c30f"). InnerVolumeSpecName "kube-api-access-lfl8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.832905 4831 generic.go:334] "Generic (PLEG): container finished" podID="d7338210-677e-478a-a591-3eacfde2c30f" containerID="f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e" exitCode=0 Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.832976 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.832984 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" event={"ID":"d7338210-677e-478a-a591-3eacfde2c30f","Type":"ContainerDied","Data":"f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e"} Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.833234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-68f795456f-mmrsd" event={"ID":"d7338210-677e-478a-a591-3eacfde2c30f","Type":"ContainerDied","Data":"3d43d225b09725a32f454026af7dba0bbee15b8fb9c099df88f73d16418a67c8"} Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.833309 4831 scope.go:117] "RemoveContainer" containerID="f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.836395 4831 generic.go:334] "Generic (PLEG): container finished" podID="5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" containerID="4118cdbf98f0a5b447c2361fdcae316c630d4c8a1e091b507663859a4ee3db73" exitCode=0 Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.836437 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4cvbb" event={"ID":"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e","Type":"ContainerDied","Data":"4118cdbf98f0a5b447c2361fdcae316c630d4c8a1e091b507663859a4ee3db73"} Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.867838 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-config" (OuterVolumeSpecName: "config") pod "d7338210-677e-478a-a591-3eacfde2c30f" (UID: "d7338210-677e-478a-a591-3eacfde2c30f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.874953 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d7338210-677e-478a-a591-3eacfde2c30f" (UID: "d7338210-677e-478a-a591-3eacfde2c30f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.881584 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d7338210-677e-478a-a591-3eacfde2c30f" (UID: "d7338210-677e-478a-a591-3eacfde2c30f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.895232 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d7338210-677e-478a-a591-3eacfde2c30f" (UID: "d7338210-677e-478a-a591-3eacfde2c30f"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.903688 4831 scope.go:117] "RemoveContainer" containerID="b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.910707 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.910742 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.910758 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.910772 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d7338210-677e-478a-a591-3eacfde2c30f-config\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.910787 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfl8g\" (UniqueName: \"kubernetes.io/projected/d7338210-677e-478a-a591-3eacfde2c30f-kube-api-access-lfl8g\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.924540 4831 scope.go:117] "RemoveContainer" containerID="f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e" Dec 03 07:59:30 crc kubenswrapper[4831]: E1203 07:59:30.925135 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e\": 
container with ID starting with f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e not found: ID does not exist" containerID="f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.925164 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e"} err="failed to get container status \"f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e\": rpc error: code = NotFound desc = could not find container \"f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e\": container with ID starting with f708e044e38a9d6eab398d3aa60ec17ae49b825234a1fdf9bbb4a032f877968e not found: ID does not exist" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.925185 4831 scope.go:117] "RemoveContainer" containerID="b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73" Dec 03 07:59:30 crc kubenswrapper[4831]: E1203 07:59:30.925636 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73\": container with ID starting with b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73 not found: ID does not exist" containerID="b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73" Dec 03 07:59:30 crc kubenswrapper[4831]: I1203 07:59:30.925659 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73"} err="failed to get container status \"b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73\": rpc error: code = NotFound desc = could not find container \"b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73\": container with ID starting with 
b332e8709831a59f3ceb0d2f28e06a243b57e90a84796d97c2fa580ff5882d73 not found: ID does not exist" Dec 03 07:59:31 crc kubenswrapper[4831]: I1203 07:59:31.171049 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-68f795456f-mmrsd"] Dec 03 07:59:31 crc kubenswrapper[4831]: I1203 07:59:31.186543 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-68f795456f-mmrsd"] Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.239271 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.435038 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-scripts\") pod \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.435453 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlks7\" (UniqueName: \"kubernetes.io/projected/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-kube-api-access-zlks7\") pod \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.435625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-fernet-keys\") pod \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.435691 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-config-data\") pod \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\" (UID: 
\"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.435752 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-combined-ca-bundle\") pod \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.435801 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-credential-keys\") pod \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\" (UID: \"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e\") " Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.444292 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-scripts" (OuterVolumeSpecName: "scripts") pod "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" (UID: "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.444372 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-kube-api-access-zlks7" (OuterVolumeSpecName: "kube-api-access-zlks7") pod "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" (UID: "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e"). InnerVolumeSpecName "kube-api-access-zlks7". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.444461 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" (UID: "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.445623 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" (UID: "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.479413 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-config-data" (OuterVolumeSpecName: "config-data") pod "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" (UID: "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.480564 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" (UID: "5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.538810 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlks7\" (UniqueName: \"kubernetes.io/projected/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-kube-api-access-zlks7\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.538892 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.538921 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.538945 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.538969 4831 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-credential-keys\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.538991 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.873998 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-4cvbb" event={"ID":"5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e","Type":"ContainerDied","Data":"4bd5cf2381af48ab3572389a7f9f7c1eb880b86fd3787dc34ca3e857408a5bc1"} Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 
07:59:32.874059 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4bd5cf2381af48ab3572389a7f9f7c1eb880b86fd3787dc34ca3e857408a5bc1" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.874071 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-4cvbb" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.962026 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-549cddd46f-vzg66"] Dec 03 07:59:32 crc kubenswrapper[4831]: E1203 07:59:32.962730 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7338210-677e-478a-a591-3eacfde2c30f" containerName="dnsmasq-dns" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.962779 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7338210-677e-478a-a591-3eacfde2c30f" containerName="dnsmasq-dns" Dec 03 07:59:32 crc kubenswrapper[4831]: E1203 07:59:32.962854 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" containerName="keystone-bootstrap" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.962876 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" containerName="keystone-bootstrap" Dec 03 07:59:32 crc kubenswrapper[4831]: E1203 07:59:32.962913 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7338210-677e-478a-a591-3eacfde2c30f" containerName="init" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.962932 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7338210-677e-478a-a591-3eacfde2c30f" containerName="init" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.963363 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7338210-677e-478a-a591-3eacfde2c30f" containerName="dnsmasq-dns" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.963423 4831 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" containerName="keystone-bootstrap" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.964499 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.967216 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.967652 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.967733 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-rhrp2" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.967668 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Dec 03 07:59:32 crc kubenswrapper[4831]: I1203 07:59:32.977277 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-549cddd46f-vzg66"] Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.019756 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.028662 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7338210-677e-478a-a591-3eacfde2c30f" path="/var/lib/kubelet/pods/d7338210-677e-478a-a591-3eacfde2c30f/volumes" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.149458 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-scripts\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.149877 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6fnd\" (UniqueName: \"kubernetes.io/projected/ebdb8511-1378-49f0-bda9-e7ae48599ca6-kube-api-access-c6fnd\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.150015 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-combined-ca-bundle\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.150105 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-credential-keys\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.150138 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-config-data\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.150182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-fernet-keys\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.251428 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-fernet-keys\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.251503 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-scripts\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.251541 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6fnd\" (UniqueName: \"kubernetes.io/projected/ebdb8511-1378-49f0-bda9-e7ae48599ca6-kube-api-access-c6fnd\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.251585 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-combined-ca-bundle\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.251661 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-credential-keys\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.251716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-config-data\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.256511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-combined-ca-bundle\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.267046 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-credential-keys\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.267728 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-fernet-keys\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.268999 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-config-data\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.269326 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ebdb8511-1378-49f0-bda9-e7ae48599ca6-scripts\") pod \"keystone-549cddd46f-vzg66\" (UID: 
\"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.286721 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6fnd\" (UniqueName: \"kubernetes.io/projected/ebdb8511-1378-49f0-bda9-e7ae48599ca6-kube-api-access-c6fnd\") pod \"keystone-549cddd46f-vzg66\" (UID: \"ebdb8511-1378-49f0-bda9-e7ae48599ca6\") " pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.585428 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:33 crc kubenswrapper[4831]: I1203 07:59:33.891509 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"b048367eeeb7f28a9954ddb05cb80ad0ca3f94ca3e078cded79e040a0e5a5a5d"} Dec 03 07:59:34 crc kubenswrapper[4831]: I1203 07:59:34.052813 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-549cddd46f-vzg66"] Dec 03 07:59:34 crc kubenswrapper[4831]: I1203 07:59:34.903862 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-549cddd46f-vzg66" event={"ID":"ebdb8511-1378-49f0-bda9-e7ae48599ca6","Type":"ContainerStarted","Data":"e915f01f4af19bf59cbc2c01bd30d3b5a55787379b41655bce6bdc6e3d2def4d"} Dec 03 07:59:34 crc kubenswrapper[4831]: I1203 07:59:34.903906 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-549cddd46f-vzg66" event={"ID":"ebdb8511-1378-49f0-bda9-e7ae48599ca6","Type":"ContainerStarted","Data":"af2e9796f46e32ff95949b4c9c3c2371740a7bd581f25ac74613dc45969d4072"} Dec 03 07:59:34 crc kubenswrapper[4831]: I1203 07:59:34.904131 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-549cddd46f-vzg66" Dec 03 07:59:34 crc kubenswrapper[4831]: I1203 
07:59:34.926129 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-549cddd46f-vzg66" podStartSLOduration=2.9261044910000003 podStartE2EDuration="2.926104491s" podCreationTimestamp="2025-12-03 07:59:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 07:59:34.925370458 +0000 UTC m=+5312.268953986" watchObservedRunningTime="2025-12-03 07:59:34.926104491 +0000 UTC m=+5312.269688019" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.215712 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m"] Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.217925 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.220570 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.221566 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.230015 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m"] Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.300552 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11173dd1-e076-4ad0-8a7e-ba71b69a805e-config-volume\") pod \"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc 
kubenswrapper[4831]: I1203 08:00:00.300796 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4mhg\" (UniqueName: \"kubernetes.io/projected/11173dd1-e076-4ad0-8a7e-ba71b69a805e-kube-api-access-s4mhg\") pod \"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.300972 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11173dd1-e076-4ad0-8a7e-ba71b69a805e-secret-volume\") pod \"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.402922 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s4mhg\" (UniqueName: \"kubernetes.io/projected/11173dd1-e076-4ad0-8a7e-ba71b69a805e-kube-api-access-s4mhg\") pod \"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.403147 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11173dd1-e076-4ad0-8a7e-ba71b69a805e-secret-volume\") pod \"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.403243 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11173dd1-e076-4ad0-8a7e-ba71b69a805e-config-volume\") pod 
\"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.405462 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11173dd1-e076-4ad0-8a7e-ba71b69a805e-config-volume\") pod \"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.411081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11173dd1-e076-4ad0-8a7e-ba71b69a805e-secret-volume\") pod \"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.424126 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4mhg\" (UniqueName: \"kubernetes.io/projected/11173dd1-e076-4ad0-8a7e-ba71b69a805e-kube-api-access-s4mhg\") pod \"collect-profiles-29412480-ngp2m\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:00 crc kubenswrapper[4831]: I1203 08:00:00.541884 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:01 crc kubenswrapper[4831]: I1203 08:00:01.022487 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m"] Dec 03 08:00:01 crc kubenswrapper[4831]: I1203 08:00:01.216717 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" event={"ID":"11173dd1-e076-4ad0-8a7e-ba71b69a805e","Type":"ContainerStarted","Data":"0b6783a4d664aa8dabb5065d54ba12ca57a961d37f718644e0bb8543a4e0a51d"} Dec 03 08:00:02 crc kubenswrapper[4831]: I1203 08:00:02.229382 4831 generic.go:334] "Generic (PLEG): container finished" podID="11173dd1-e076-4ad0-8a7e-ba71b69a805e" containerID="d94b20946ce0656c0d56e455c92899d295eb3a008f134160ab2a94bf4d8dd742" exitCode=0 Dec 03 08:00:02 crc kubenswrapper[4831]: I1203 08:00:02.229448 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" event={"ID":"11173dd1-e076-4ad0-8a7e-ba71b69a805e","Type":"ContainerDied","Data":"d94b20946ce0656c0d56e455c92899d295eb3a008f134160ab2a94bf4d8dd742"} Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.540304 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.661880 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4mhg\" (UniqueName: \"kubernetes.io/projected/11173dd1-e076-4ad0-8a7e-ba71b69a805e-kube-api-access-s4mhg\") pod \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.662236 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11173dd1-e076-4ad0-8a7e-ba71b69a805e-config-volume\") pod \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.662444 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11173dd1-e076-4ad0-8a7e-ba71b69a805e-secret-volume\") pod \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\" (UID: \"11173dd1-e076-4ad0-8a7e-ba71b69a805e\") " Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.664974 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11173dd1-e076-4ad0-8a7e-ba71b69a805e-config-volume" (OuterVolumeSpecName: "config-volume") pod "11173dd1-e076-4ad0-8a7e-ba71b69a805e" (UID: "11173dd1-e076-4ad0-8a7e-ba71b69a805e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.668366 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11173dd1-e076-4ad0-8a7e-ba71b69a805e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "11173dd1-e076-4ad0-8a7e-ba71b69a805e" (UID: "11173dd1-e076-4ad0-8a7e-ba71b69a805e"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.668536 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11173dd1-e076-4ad0-8a7e-ba71b69a805e-kube-api-access-s4mhg" (OuterVolumeSpecName: "kube-api-access-s4mhg") pod "11173dd1-e076-4ad0-8a7e-ba71b69a805e" (UID: "11173dd1-e076-4ad0-8a7e-ba71b69a805e"). InnerVolumeSpecName "kube-api-access-s4mhg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.764935 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/11173dd1-e076-4ad0-8a7e-ba71b69a805e-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.764984 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4mhg\" (UniqueName: \"kubernetes.io/projected/11173dd1-e076-4ad0-8a7e-ba71b69a805e-kube-api-access-s4mhg\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:03 crc kubenswrapper[4831]: I1203 08:00:03.765001 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/11173dd1-e076-4ad0-8a7e-ba71b69a805e-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:04 crc kubenswrapper[4831]: I1203 08:00:04.255357 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" event={"ID":"11173dd1-e076-4ad0-8a7e-ba71b69a805e","Type":"ContainerDied","Data":"0b6783a4d664aa8dabb5065d54ba12ca57a961d37f718644e0bb8543a4e0a51d"} Dec 03 08:00:04 crc kubenswrapper[4831]: I1203 08:00:04.255446 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b6783a4d664aa8dabb5065d54ba12ca57a961d37f718644e0bb8543a4e0a51d" Dec 03 08:00:04 crc kubenswrapper[4831]: I1203 08:00:04.255606 4831 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m" Dec 03 08:00:04 crc kubenswrapper[4831]: I1203 08:00:04.618895 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"] Dec 03 08:00:04 crc kubenswrapper[4831]: I1203 08:00:04.646097 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412435-jch65"] Dec 03 08:00:05 crc kubenswrapper[4831]: I1203 08:00:05.024377 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06133fcc-8cb2-4cd0-8d86-d18e7fbb838c" path="/var/lib/kubelet/pods/06133fcc-8cb2-4cd0-8d86-d18e7fbb838c/volumes" Dec 03 08:00:05 crc kubenswrapper[4831]: I1203 08:00:05.062457 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-549cddd46f-vzg66" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.354545 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 08:00:08 crc kubenswrapper[4831]: E1203 08:00:08.355856 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11173dd1-e076-4ad0-8a7e-ba71b69a805e" containerName="collect-profiles" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.355889 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="11173dd1-e076-4ad0-8a7e-ba71b69a805e" containerName="collect-profiles" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.356262 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="11173dd1-e076-4ad0-8a7e-ba71b69a805e" containerName="collect-profiles" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.357646 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.360475 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.360782 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wmlv6" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.361668 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.368234 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.450380 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brb7p\" (UniqueName: \"kubernetes.io/projected/fdb7ee35-d757-414f-b20f-227ce78917e7-kube-api-access-brb7p\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.450503 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.450546 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.551811 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brb7p\" (UniqueName: \"kubernetes.io/projected/fdb7ee35-d757-414f-b20f-227ce78917e7-kube-api-access-brb7p\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.551912 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.551942 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.553194 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.563819 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config-secret\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.570640 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brb7p\" (UniqueName: 
\"kubernetes.io/projected/fdb7ee35-d757-414f-b20f-227ce78917e7-kube-api-access-brb7p\") pod \"openstackclient\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") " pod="openstack/openstackclient" Dec 03 08:00:08 crc kubenswrapper[4831]: I1203 08:00:08.693884 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 08:00:09 crc kubenswrapper[4831]: I1203 08:00:09.160526 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 08:00:09 crc kubenswrapper[4831]: I1203 08:00:09.322667 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fdb7ee35-d757-414f-b20f-227ce78917e7","Type":"ContainerStarted","Data":"1219e03ca599966e61664e1dd26bec625ea26af74f9551d1bedad613cfeea905"} Dec 03 08:00:10 crc kubenswrapper[4831]: I1203 08:00:10.333711 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"fdb7ee35-d757-414f-b20f-227ce78917e7","Type":"ContainerStarted","Data":"4355591e4499fd416dff34cd2bd7e62fac68847762422a98953e866be23cf375"} Dec 03 08:00:10 crc kubenswrapper[4831]: I1203 08:00:10.360012 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.359989063 podStartE2EDuration="2.359989063s" podCreationTimestamp="2025-12-03 08:00:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:00:10.357224807 +0000 UTC m=+5347.700808335" watchObservedRunningTime="2025-12-03 08:00:10.359989063 +0000 UTC m=+5347.703572581" Dec 03 08:00:19 crc kubenswrapper[4831]: I1203 08:00:19.157074 4831 scope.go:117] "RemoveContainer" containerID="b2a439d81b7701e329ea12c32447d28753fa831e721e77cdd70233207db7e14a" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.511496 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-84tzh"] Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.516175 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.535568 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-84tzh"] Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.548651 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-utilities\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.548731 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pwxd\" (UniqueName: \"kubernetes.io/projected/6db3827a-b9e3-4662-b929-8108160236ca-kube-api-access-6pwxd\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.548770 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-catalog-content\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.650302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-utilities\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " 
pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.650393 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pwxd\" (UniqueName: \"kubernetes.io/projected/6db3827a-b9e3-4662-b929-8108160236ca-kube-api-access-6pwxd\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.650422 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-catalog-content\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.650770 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-utilities\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.650853 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-catalog-content\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.669767 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pwxd\" (UniqueName: \"kubernetes.io/projected/6db3827a-b9e3-4662-b929-8108160236ca-kube-api-access-6pwxd\") pod \"redhat-operators-84tzh\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " pod="openshift-marketplace/redhat-operators-84tzh" Dec 
03 08:00:38 crc kubenswrapper[4831]: I1203 08:00:38.854341 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.291619 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-4phrr"] Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.293705 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.372756 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-utilities\") pod \"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.372881 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zbqh\" (UniqueName: \"kubernetes.io/projected/08d16927-3e78-401d-8adb-8f2ad72ee72f-kube-api-access-7zbqh\") pod \"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.372994 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-catalog-content\") pod \"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.374526 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4phrr"] Dec 03 08:00:39 crc 
kubenswrapper[4831]: I1203 08:00:39.405485 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-84tzh"] Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.480213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-catalog-content\") pod \"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.480329 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-utilities\") pod \"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.480361 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7zbqh\" (UniqueName: \"kubernetes.io/projected/08d16927-3e78-401d-8adb-8f2ad72ee72f-kube-api-access-7zbqh\") pod \"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.481003 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-catalog-content\") pod \"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.481207 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-utilities\") pod 
\"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.509266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zbqh\" (UniqueName: \"kubernetes.io/projected/08d16927-3e78-401d-8adb-8f2ad72ee72f-kube-api-access-7zbqh\") pod \"community-operators-4phrr\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.613404 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.647260 4831 generic.go:334] "Generic (PLEG): container finished" podID="6db3827a-b9e3-4662-b929-8108160236ca" containerID="b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225" exitCode=0 Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.647587 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84tzh" event={"ID":"6db3827a-b9e3-4662-b929-8108160236ca","Type":"ContainerDied","Data":"b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225"} Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.647690 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84tzh" event={"ID":"6db3827a-b9e3-4662-b929-8108160236ca","Type":"ContainerStarted","Data":"b5090a5ed19ea92d1db7d582501410e16e011f5d03b2ad68099535a0cefea3dd"} Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.649816 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:00:39 crc kubenswrapper[4831]: I1203 08:00:39.911925 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-4phrr"] Dec 03 08:00:40 crc 
kubenswrapper[4831]: I1203 08:00:40.659236 4831 generic.go:334] "Generic (PLEG): container finished" podID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerID="f2586c4a552e38e5220818c23016d60700c3bf4615b24e8a02cf9ad125dded3b" exitCode=0 Dec 03 08:00:40 crc kubenswrapper[4831]: I1203 08:00:40.660965 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phrr" event={"ID":"08d16927-3e78-401d-8adb-8f2ad72ee72f","Type":"ContainerDied","Data":"f2586c4a552e38e5220818c23016d60700c3bf4615b24e8a02cf9ad125dded3b"} Dec 03 08:00:40 crc kubenswrapper[4831]: I1203 08:00:40.661165 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phrr" event={"ID":"08d16927-3e78-401d-8adb-8f2ad72ee72f","Type":"ContainerStarted","Data":"2c8b427d8fbd019d144c686be7545f5472e953fa3b7fe0e51990925c9bd94f25"} Dec 03 08:00:41 crc kubenswrapper[4831]: I1203 08:00:41.676781 4831 generic.go:334] "Generic (PLEG): container finished" podID="6db3827a-b9e3-4662-b929-8108160236ca" containerID="5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a" exitCode=0 Dec 03 08:00:41 crc kubenswrapper[4831]: I1203 08:00:41.676908 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84tzh" event={"ID":"6db3827a-b9e3-4662-b929-8108160236ca","Type":"ContainerDied","Data":"5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a"} Dec 03 08:00:42 crc kubenswrapper[4831]: I1203 08:00:42.690458 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84tzh" event={"ID":"6db3827a-b9e3-4662-b929-8108160236ca","Type":"ContainerStarted","Data":"3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49"} Dec 03 08:00:42 crc kubenswrapper[4831]: I1203 08:00:42.692094 4831 generic.go:334] "Generic (PLEG): container finished" podID="08d16927-3e78-401d-8adb-8f2ad72ee72f" 
containerID="32e95c7480b811f96ec86ea221c6e0f2b096eb3c2d13dd2962af8f081fd43a80" exitCode=0 Dec 03 08:00:42 crc kubenswrapper[4831]: I1203 08:00:42.692148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phrr" event={"ID":"08d16927-3e78-401d-8adb-8f2ad72ee72f","Type":"ContainerDied","Data":"32e95c7480b811f96ec86ea221c6e0f2b096eb3c2d13dd2962af8f081fd43a80"} Dec 03 08:00:43 crc kubenswrapper[4831]: I1203 08:00:43.702307 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phrr" event={"ID":"08d16927-3e78-401d-8adb-8f2ad72ee72f","Type":"ContainerStarted","Data":"4d26490ed9c88b4bb9c7aa893515514f20a3217aadc9ab73b251009fb485f665"} Dec 03 08:00:43 crc kubenswrapper[4831]: I1203 08:00:43.724415 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-84tzh" podStartSLOduration=3.120084045 podStartE2EDuration="5.724393792s" podCreationTimestamp="2025-12-03 08:00:38 +0000 UTC" firstStartedPulling="2025-12-03 08:00:39.649533643 +0000 UTC m=+5376.993117151" lastFinishedPulling="2025-12-03 08:00:42.25384337 +0000 UTC m=+5379.597426898" observedRunningTime="2025-12-03 08:00:43.723479084 +0000 UTC m=+5381.067062622" watchObservedRunningTime="2025-12-03 08:00:43.724393792 +0000 UTC m=+5381.067977300" Dec 03 08:00:43 crc kubenswrapper[4831]: I1203 08:00:43.750135 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-4phrr" podStartSLOduration=2.131578094 podStartE2EDuration="4.750117003s" podCreationTimestamp="2025-12-03 08:00:39 +0000 UTC" firstStartedPulling="2025-12-03 08:00:40.66325389 +0000 UTC m=+5378.006837438" lastFinishedPulling="2025-12-03 08:00:43.281792839 +0000 UTC m=+5380.625376347" observedRunningTime="2025-12-03 08:00:43.744418596 +0000 UTC m=+5381.088002154" watchObservedRunningTime="2025-12-03 08:00:43.750117003 +0000 UTC m=+5381.093700521" 
Dec 03 08:00:48 crc kubenswrapper[4831]: I1203 08:00:48.855465 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:48 crc kubenswrapper[4831]: I1203 08:00:48.855881 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:48 crc kubenswrapper[4831]: I1203 08:00:48.910346 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:49 crc kubenswrapper[4831]: I1203 08:00:49.614100 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:49 crc kubenswrapper[4831]: I1203 08:00:49.614155 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:49 crc kubenswrapper[4831]: I1203 08:00:49.677202 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:49 crc kubenswrapper[4831]: I1203 08:00:49.796797 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:49 crc kubenswrapper[4831]: I1203 08:00:49.802286 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:51 crc kubenswrapper[4831]: I1203 08:00:51.483740 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4phrr"] Dec 03 08:00:51 crc kubenswrapper[4831]: I1203 08:00:51.771729 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-4phrr" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerName="registry-server" 
containerID="cri-o://4d26490ed9c88b4bb9c7aa893515514f20a3217aadc9ab73b251009fb485f665" gracePeriod=2 Dec 03 08:00:52 crc kubenswrapper[4831]: I1203 08:00:52.093118 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-84tzh"] Dec 03 08:00:52 crc kubenswrapper[4831]: I1203 08:00:52.093585 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-84tzh" podUID="6db3827a-b9e3-4662-b929-8108160236ca" containerName="registry-server" containerID="cri-o://3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49" gracePeriod=2 Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.671126 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.765107 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6pwxd\" (UniqueName: \"kubernetes.io/projected/6db3827a-b9e3-4662-b929-8108160236ca-kube-api-access-6pwxd\") pod \"6db3827a-b9e3-4662-b929-8108160236ca\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.765179 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-catalog-content\") pod \"6db3827a-b9e3-4662-b929-8108160236ca\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.765204 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-utilities\") pod \"6db3827a-b9e3-4662-b929-8108160236ca\" (UID: \"6db3827a-b9e3-4662-b929-8108160236ca\") " Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.766478 4831 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-utilities" (OuterVolumeSpecName: "utilities") pod "6db3827a-b9e3-4662-b929-8108160236ca" (UID: "6db3827a-b9e3-4662-b929-8108160236ca"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.779293 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6db3827a-b9e3-4662-b929-8108160236ca-kube-api-access-6pwxd" (OuterVolumeSpecName: "kube-api-access-6pwxd") pod "6db3827a-b9e3-4662-b929-8108160236ca" (UID: "6db3827a-b9e3-4662-b929-8108160236ca"). InnerVolumeSpecName "kube-api-access-6pwxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.802998 4831 generic.go:334] "Generic (PLEG): container finished" podID="6db3827a-b9e3-4662-b929-8108160236ca" containerID="3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49" exitCode=0 Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.803068 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84tzh" event={"ID":"6db3827a-b9e3-4662-b929-8108160236ca","Type":"ContainerDied","Data":"3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49"} Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.803100 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-84tzh" event={"ID":"6db3827a-b9e3-4662-b929-8108160236ca","Type":"ContainerDied","Data":"b5090a5ed19ea92d1db7d582501410e16e011f5d03b2ad68099535a0cefea3dd"} Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.803121 4831 scope.go:117] "RemoveContainer" containerID="3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.803258 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-84tzh" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.807334 4831 generic.go:334] "Generic (PLEG): container finished" podID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerID="4d26490ed9c88b4bb9c7aa893515514f20a3217aadc9ab73b251009fb485f665" exitCode=0 Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.807383 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phrr" event={"ID":"08d16927-3e78-401d-8adb-8f2ad72ee72f","Type":"ContainerDied","Data":"4d26490ed9c88b4bb9c7aa893515514f20a3217aadc9ab73b251009fb485f665"} Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.842978 4831 scope.go:117] "RemoveContainer" containerID="5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.867596 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6pwxd\" (UniqueName: \"kubernetes.io/projected/6db3827a-b9e3-4662-b929-8108160236ca-kube-api-access-6pwxd\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.867648 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.883031 4831 scope.go:117] "RemoveContainer" containerID="b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.921559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6db3827a-b9e3-4662-b929-8108160236ca" (UID: "6db3827a-b9e3-4662-b929-8108160236ca"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.926143 4831 scope.go:117] "RemoveContainer" containerID="3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49" Dec 03 08:00:53 crc kubenswrapper[4831]: E1203 08:00:53.926800 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49\": container with ID starting with 3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49 not found: ID does not exist" containerID="3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.926839 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49"} err="failed to get container status \"3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49\": rpc error: code = NotFound desc = could not find container \"3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49\": container with ID starting with 3b22c6180da3643cb455633c51a7647ea2e6881653c1ab950bf799cbf0c09c49 not found: ID does not exist" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.926893 4831 scope.go:117] "RemoveContainer" containerID="5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a" Dec 03 08:00:53 crc kubenswrapper[4831]: E1203 08:00:53.927346 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a\": container with ID starting with 5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a not found: ID does not exist" containerID="5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.927384 
4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a"} err="failed to get container status \"5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a\": rpc error: code = NotFound desc = could not find container \"5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a\": container with ID starting with 5978ed12e80f7d7935f8ecfbc6022b0b42de03b4f0c57d088a3cd2beb605578a not found: ID does not exist" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.927414 4831 scope.go:117] "RemoveContainer" containerID="b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225" Dec 03 08:00:53 crc kubenswrapper[4831]: E1203 08:00:53.927753 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225\": container with ID starting with b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225 not found: ID does not exist" containerID="b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.927826 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225"} err="failed to get container status \"b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225\": rpc error: code = NotFound desc = could not find container \"b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225\": container with ID starting with b3d0fca47ef5c5123758adc4c7d1c3489b036a5d227d4e6d31e602b27d145225 not found: ID does not exist" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.968732 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6db3827a-b9e3-4662-b929-8108160236ca-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:53 crc kubenswrapper[4831]: I1203 08:00:53.984173 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.069617 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-utilities\") pod \"08d16927-3e78-401d-8adb-8f2ad72ee72f\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.069752 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-catalog-content\") pod \"08d16927-3e78-401d-8adb-8f2ad72ee72f\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.069835 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zbqh\" (UniqueName: \"kubernetes.io/projected/08d16927-3e78-401d-8adb-8f2ad72ee72f-kube-api-access-7zbqh\") pod \"08d16927-3e78-401d-8adb-8f2ad72ee72f\" (UID: \"08d16927-3e78-401d-8adb-8f2ad72ee72f\") " Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.071030 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-utilities" (OuterVolumeSpecName: "utilities") pod "08d16927-3e78-401d-8adb-8f2ad72ee72f" (UID: "08d16927-3e78-401d-8adb-8f2ad72ee72f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.073782 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08d16927-3e78-401d-8adb-8f2ad72ee72f-kube-api-access-7zbqh" (OuterVolumeSpecName: "kube-api-access-7zbqh") pod "08d16927-3e78-401d-8adb-8f2ad72ee72f" (UID: "08d16927-3e78-401d-8adb-8f2ad72ee72f"). InnerVolumeSpecName "kube-api-access-7zbqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.120093 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "08d16927-3e78-401d-8adb-8f2ad72ee72f" (UID: "08d16927-3e78-401d-8adb-8f2ad72ee72f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.154814 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-84tzh"] Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.165592 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-84tzh"] Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.171565 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.171600 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7zbqh\" (UniqueName: \"kubernetes.io/projected/08d16927-3e78-401d-8adb-8f2ad72ee72f-kube-api-access-7zbqh\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.171610 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/08d16927-3e78-401d-8adb-8f2ad72ee72f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.819538 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-4phrr" event={"ID":"08d16927-3e78-401d-8adb-8f2ad72ee72f","Type":"ContainerDied","Data":"2c8b427d8fbd019d144c686be7545f5472e953fa3b7fe0e51990925c9bd94f25"} Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.819906 4831 scope.go:117] "RemoveContainer" containerID="4d26490ed9c88b4bb9c7aa893515514f20a3217aadc9ab73b251009fb485f665" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.819914 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-4phrr" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.851284 4831 scope.go:117] "RemoveContainer" containerID="32e95c7480b811f96ec86ea221c6e0f2b096eb3c2d13dd2962af8f081fd43a80" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.877007 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-4phrr"] Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.896538 4831 scope.go:117] "RemoveContainer" containerID="f2586c4a552e38e5220818c23016d60700c3bf4615b24e8a02cf9ad125dded3b" Dec 03 08:00:54 crc kubenswrapper[4831]: I1203 08:00:54.897878 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-4phrr"] Dec 03 08:00:55 crc kubenswrapper[4831]: I1203 08:00:55.023325 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" path="/var/lib/kubelet/pods/08d16927-3e78-401d-8adb-8f2ad72ee72f/volumes" Dec 03 08:00:55 crc kubenswrapper[4831]: I1203 08:00:55.024276 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6db3827a-b9e3-4662-b929-8108160236ca" 
path="/var/lib/kubelet/pods/6db3827a-b9e3-4662-b929-8108160236ca/volumes" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.176856 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412481-qnsng"] Dec 03 08:01:00 crc kubenswrapper[4831]: E1203 08:01:00.177805 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerName="extract-utilities" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.177820 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerName="extract-utilities" Dec 03 08:01:00 crc kubenswrapper[4831]: E1203 08:01:00.177839 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db3827a-b9e3-4662-b929-8108160236ca" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.177850 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db3827a-b9e3-4662-b929-8108160236ca" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4831]: E1203 08:01:00.177866 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db3827a-b9e3-4662-b929-8108160236ca" containerName="extract-content" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.177874 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db3827a-b9e3-4662-b929-8108160236ca" containerName="extract-content" Dec 03 08:01:00 crc kubenswrapper[4831]: E1203 08:01:00.177889 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.177895 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4831]: E1203 08:01:00.177912 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6db3827a-b9e3-4662-b929-8108160236ca" 
containerName="extract-utilities" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.177920 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6db3827a-b9e3-4662-b929-8108160236ca" containerName="extract-utilities" Dec 03 08:01:00 crc kubenswrapper[4831]: E1203 08:01:00.177931 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerName="extract-content" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.177937 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerName="extract-content" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.178111 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6db3827a-b9e3-4662-b929-8108160236ca" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.178129 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="08d16927-3e78-401d-8adb-8f2ad72ee72f" containerName="registry-server" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.178808 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.183740 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412481-qnsng"] Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.293499 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-fernet-keys\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.293547 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-combined-ca-bundle\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.293593 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-config-data\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.293698 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5z55\" (UniqueName: \"kubernetes.io/projected/34df800e-8b70-4658-a6ae-a639bca251f5-kube-api-access-j5z55\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.395445 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-j5z55\" (UniqueName: \"kubernetes.io/projected/34df800e-8b70-4658-a6ae-a639bca251f5-kube-api-access-j5z55\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.395601 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-fernet-keys\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.395629 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-combined-ca-bundle\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.395679 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-config-data\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.401402 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-config-data\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.401748 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: 
\"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-fernet-keys\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.403867 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-combined-ca-bundle\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.439559 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5z55\" (UniqueName: \"kubernetes.io/projected/34df800e-8b70-4658-a6ae-a639bca251f5-kube-api-access-j5z55\") pod \"keystone-cron-29412481-qnsng\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.496467 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:00 crc kubenswrapper[4831]: I1203 08:01:00.953658 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412481-qnsng"] Dec 03 08:01:01 crc kubenswrapper[4831]: I1203 08:01:01.904301 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412481-qnsng" event={"ID":"34df800e-8b70-4658-a6ae-a639bca251f5","Type":"ContainerStarted","Data":"00c6dc340ea9db6636906c3ffead827c209b6bd4c0b71dffa94ff719a0ff9cc2"} Dec 03 08:01:01 crc kubenswrapper[4831]: I1203 08:01:01.904838 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412481-qnsng" event={"ID":"34df800e-8b70-4658-a6ae-a639bca251f5","Type":"ContainerStarted","Data":"2e4b39513bcd121ed78c7b3f93c9795343addbc6fe74948527796fd69eaca938"} Dec 03 08:01:01 crc kubenswrapper[4831]: I1203 08:01:01.938958 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412481-qnsng" podStartSLOduration=1.938930998 podStartE2EDuration="1.938930998s" podCreationTimestamp="2025-12-03 08:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:01:01.931123515 +0000 UTC m=+5399.274707113" watchObservedRunningTime="2025-12-03 08:01:01.938930998 +0000 UTC m=+5399.282514546" Dec 03 08:01:02 crc kubenswrapper[4831]: I1203 08:01:02.912779 4831 generic.go:334] "Generic (PLEG): container finished" podID="34df800e-8b70-4658-a6ae-a639bca251f5" containerID="00c6dc340ea9db6636906c3ffead827c209b6bd4c0b71dffa94ff719a0ff9cc2" exitCode=0 Dec 03 08:01:02 crc kubenswrapper[4831]: I1203 08:01:02.912848 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412481-qnsng" 
event={"ID":"34df800e-8b70-4658-a6ae-a639bca251f5","Type":"ContainerDied","Data":"00c6dc340ea9db6636906c3ffead827c209b6bd4c0b71dffa94ff719a0ff9cc2"} Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.286485 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.365115 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-combined-ca-bundle\") pod \"34df800e-8b70-4658-a6ae-a639bca251f5\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.365173 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-fernet-keys\") pod \"34df800e-8b70-4658-a6ae-a639bca251f5\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.365206 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5z55\" (UniqueName: \"kubernetes.io/projected/34df800e-8b70-4658-a6ae-a639bca251f5-kube-api-access-j5z55\") pod \"34df800e-8b70-4658-a6ae-a639bca251f5\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.370491 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34df800e-8b70-4658-a6ae-a639bca251f5-kube-api-access-j5z55" (OuterVolumeSpecName: "kube-api-access-j5z55") pod "34df800e-8b70-4658-a6ae-a639bca251f5" (UID: "34df800e-8b70-4658-a6ae-a639bca251f5"). InnerVolumeSpecName "kube-api-access-j5z55". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.375399 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "34df800e-8b70-4658-a6ae-a639bca251f5" (UID: "34df800e-8b70-4658-a6ae-a639bca251f5"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.409145 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34df800e-8b70-4658-a6ae-a639bca251f5" (UID: "34df800e-8b70-4658-a6ae-a639bca251f5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.466505 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-config-data\") pod \"34df800e-8b70-4658-a6ae-a639bca251f5\" (UID: \"34df800e-8b70-4658-a6ae-a639bca251f5\") " Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.467072 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.467099 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.467111 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5z55\" (UniqueName: 
\"kubernetes.io/projected/34df800e-8b70-4658-a6ae-a639bca251f5-kube-api-access-j5z55\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.507282 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-config-data" (OuterVolumeSpecName: "config-data") pod "34df800e-8b70-4658-a6ae-a639bca251f5" (UID: "34df800e-8b70-4658-a6ae-a639bca251f5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.568465 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34df800e-8b70-4658-a6ae-a639bca251f5-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.934829 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412481-qnsng" event={"ID":"34df800e-8b70-4658-a6ae-a639bca251f5","Type":"ContainerDied","Data":"2e4b39513bcd121ed78c7b3f93c9795343addbc6fe74948527796fd69eaca938"} Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.934880 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e4b39513bcd121ed78c7b3f93c9795343addbc6fe74948527796fd69eaca938" Dec 03 08:01:04 crc kubenswrapper[4831]: I1203 08:01:04.934930 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412481-qnsng" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.341799 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-25bxf"] Dec 03 08:01:47 crc kubenswrapper[4831]: E1203 08:01:47.342816 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34df800e-8b70-4658-a6ae-a639bca251f5" containerName="keystone-cron" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.342840 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="34df800e-8b70-4658-a6ae-a639bca251f5" containerName="keystone-cron" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.343113 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="34df800e-8b70-4658-a6ae-a639bca251f5" containerName="keystone-cron" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.343923 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.352211 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-25bxf"] Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.440283 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-218e-account-create-update-drsks"] Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.441802 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.443869 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.449601 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-218e-account-create-update-drsks"] Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.481236 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-operator-scripts\") pod \"barbican-db-create-25bxf\" (UID: \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\") " pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.481487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mk66\" (UniqueName: \"kubernetes.io/projected/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-kube-api-access-2mk66\") pod \"barbican-db-create-25bxf\" (UID: \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\") " pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.582522 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-operator-scripts\") pod \"barbican-db-create-25bxf\" (UID: \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\") " pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.582901 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mk66\" (UniqueName: \"kubernetes.io/projected/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-kube-api-access-2mk66\") pod \"barbican-db-create-25bxf\" (UID: \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\") " 
pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.582941 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md76l\" (UniqueName: \"kubernetes.io/projected/27769491-0a4a-41cd-869a-83484c81873c-kube-api-access-md76l\") pod \"barbican-218e-account-create-update-drsks\" (UID: \"27769491-0a4a-41cd-869a-83484c81873c\") " pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.582990 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27769491-0a4a-41cd-869a-83484c81873c-operator-scripts\") pod \"barbican-218e-account-create-update-drsks\" (UID: \"27769491-0a4a-41cd-869a-83484c81873c\") " pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.583215 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-operator-scripts\") pod \"barbican-db-create-25bxf\" (UID: \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\") " pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.608582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mk66\" (UniqueName: \"kubernetes.io/projected/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-kube-api-access-2mk66\") pod \"barbican-db-create-25bxf\" (UID: \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\") " pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.685168 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-md76l\" (UniqueName: \"kubernetes.io/projected/27769491-0a4a-41cd-869a-83484c81873c-kube-api-access-md76l\") pod 
\"barbican-218e-account-create-update-drsks\" (UID: \"27769491-0a4a-41cd-869a-83484c81873c\") " pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.685267 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27769491-0a4a-41cd-869a-83484c81873c-operator-scripts\") pod \"barbican-218e-account-create-update-drsks\" (UID: \"27769491-0a4a-41cd-869a-83484c81873c\") " pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.686352 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27769491-0a4a-41cd-869a-83484c81873c-operator-scripts\") pod \"barbican-218e-account-create-update-drsks\" (UID: \"27769491-0a4a-41cd-869a-83484c81873c\") " pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.704955 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md76l\" (UniqueName: \"kubernetes.io/projected/27769491-0a4a-41cd-869a-83484c81873c-kube-api-access-md76l\") pod \"barbican-218e-account-create-update-drsks\" (UID: \"27769491-0a4a-41cd-869a-83484c81873c\") " pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.710794 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:47 crc kubenswrapper[4831]: I1203 08:01:47.768246 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:48 crc kubenswrapper[4831]: I1203 08:01:48.208955 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-218e-account-create-update-drsks"] Dec 03 08:01:48 crc kubenswrapper[4831]: I1203 08:01:48.270285 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-25bxf"] Dec 03 08:01:48 crc kubenswrapper[4831]: W1203 08:01:48.272532 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0e47a0a9_fa16_47ce_aecb_2e300ac07ea2.slice/crio-eb4dcbbfc4122320c855cbf7a47886077a9fc2f98de62b5db8008643b1430989 WatchSource:0}: Error finding container eb4dcbbfc4122320c855cbf7a47886077a9fc2f98de62b5db8008643b1430989: Status 404 returned error can't find the container with id eb4dcbbfc4122320c855cbf7a47886077a9fc2f98de62b5db8008643b1430989 Dec 03 08:01:48 crc kubenswrapper[4831]: I1203 08:01:48.429053 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-218e-account-create-update-drsks" event={"ID":"27769491-0a4a-41cd-869a-83484c81873c","Type":"ContainerStarted","Data":"be3cce8c69525fd8502a9b86191a3cf670815ece566d0489c60f1396e30559a1"} Dec 03 08:01:48 crc kubenswrapper[4831]: I1203 08:01:48.429426 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-218e-account-create-update-drsks" event={"ID":"27769491-0a4a-41cd-869a-83484c81873c","Type":"ContainerStarted","Data":"38ec9b9ebaff12ef93989b18d6b7e511fd7553f8db446cf744051934849643c2"} Dec 03 08:01:48 crc kubenswrapper[4831]: I1203 08:01:48.436235 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-25bxf" event={"ID":"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2","Type":"ContainerStarted","Data":"eb4dcbbfc4122320c855cbf7a47886077a9fc2f98de62b5db8008643b1430989"} Dec 03 08:01:48 crc kubenswrapper[4831]: I1203 08:01:48.478199 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-218e-account-create-update-drsks" podStartSLOduration=1.478184658 podStartE2EDuration="1.478184658s" podCreationTimestamp="2025-12-03 08:01:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:01:48.476153315 +0000 UTC m=+5445.819736843" watchObservedRunningTime="2025-12-03 08:01:48.478184658 +0000 UTC m=+5445.821768166" Dec 03 08:01:49 crc kubenswrapper[4831]: I1203 08:01:49.445893 4831 generic.go:334] "Generic (PLEG): container finished" podID="27769491-0a4a-41cd-869a-83484c81873c" containerID="be3cce8c69525fd8502a9b86191a3cf670815ece566d0489c60f1396e30559a1" exitCode=0 Dec 03 08:01:49 crc kubenswrapper[4831]: I1203 08:01:49.445958 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-218e-account-create-update-drsks" event={"ID":"27769491-0a4a-41cd-869a-83484c81873c","Type":"ContainerDied","Data":"be3cce8c69525fd8502a9b86191a3cf670815ece566d0489c60f1396e30559a1"} Dec 03 08:01:49 crc kubenswrapper[4831]: I1203 08:01:49.449002 4831 generic.go:334] "Generic (PLEG): container finished" podID="0e47a0a9-fa16-47ce-aecb-2e300ac07ea2" containerID="a5474d58cb26367afdfdcf3687160c61d1384a6a99e81a9df015b289d7275ce0" exitCode=0 Dec 03 08:01:49 crc kubenswrapper[4831]: I1203 08:01:49.449068 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-25bxf" event={"ID":"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2","Type":"ContainerDied","Data":"a5474d58cb26367afdfdcf3687160c61d1384a6a99e81a9df015b289d7275ce0"} Dec 03 08:01:50 crc kubenswrapper[4831]: I1203 08:01:50.846292 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:50 crc kubenswrapper[4831]: I1203 08:01:50.855019 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:50 crc kubenswrapper[4831]: I1203 08:01:50.941752 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2mk66\" (UniqueName: \"kubernetes.io/projected/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-kube-api-access-2mk66\") pod \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\" (UID: \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\") " Dec 03 08:01:50 crc kubenswrapper[4831]: I1203 08:01:50.942125 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-operator-scripts\") pod \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\" (UID: \"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2\") " Dec 03 08:01:50 crc kubenswrapper[4831]: I1203 08:01:50.942738 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0e47a0a9-fa16-47ce-aecb-2e300ac07ea2" (UID: "0e47a0a9-fa16-47ce-aecb-2e300ac07ea2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:01:50 crc kubenswrapper[4831]: I1203 08:01:50.947680 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-kube-api-access-2mk66" (OuterVolumeSpecName: "kube-api-access-2mk66") pod "0e47a0a9-fa16-47ce-aecb-2e300ac07ea2" (UID: "0e47a0a9-fa16-47ce-aecb-2e300ac07ea2"). InnerVolumeSpecName "kube-api-access-2mk66". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.043698 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27769491-0a4a-41cd-869a-83484c81873c-operator-scripts\") pod \"27769491-0a4a-41cd-869a-83484c81873c\" (UID: \"27769491-0a4a-41cd-869a-83484c81873c\") " Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.043807 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-md76l\" (UniqueName: \"kubernetes.io/projected/27769491-0a4a-41cd-869a-83484c81873c-kube-api-access-md76l\") pod \"27769491-0a4a-41cd-869a-83484c81873c\" (UID: \"27769491-0a4a-41cd-869a-83484c81873c\") " Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.044107 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2mk66\" (UniqueName: \"kubernetes.io/projected/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-kube-api-access-2mk66\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.044122 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.044343 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27769491-0a4a-41cd-869a-83484c81873c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "27769491-0a4a-41cd-869a-83484c81873c" (UID: "27769491-0a4a-41cd-869a-83484c81873c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.046571 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27769491-0a4a-41cd-869a-83484c81873c-kube-api-access-md76l" (OuterVolumeSpecName: "kube-api-access-md76l") pod "27769491-0a4a-41cd-869a-83484c81873c" (UID: "27769491-0a4a-41cd-869a-83484c81873c"). InnerVolumeSpecName "kube-api-access-md76l". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.145877 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/27769491-0a4a-41cd-869a-83484c81873c-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.145922 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-md76l\" (UniqueName: \"kubernetes.io/projected/27769491-0a4a-41cd-869a-83484c81873c-kube-api-access-md76l\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.472050 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-218e-account-create-update-drsks" event={"ID":"27769491-0a4a-41cd-869a-83484c81873c","Type":"ContainerDied","Data":"38ec9b9ebaff12ef93989b18d6b7e511fd7553f8db446cf744051934849643c2"} Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.472104 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38ec9b9ebaff12ef93989b18d6b7e511fd7553f8db446cf744051934849643c2" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.472220 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-218e-account-create-update-drsks" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.474521 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-25bxf" event={"ID":"0e47a0a9-fa16-47ce-aecb-2e300ac07ea2","Type":"ContainerDied","Data":"eb4dcbbfc4122320c855cbf7a47886077a9fc2f98de62b5db8008643b1430989"} Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.474543 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb4dcbbfc4122320c855cbf7a47886077a9fc2f98de62b5db8008643b1430989" Dec 03 08:01:51 crc kubenswrapper[4831]: I1203 08:01:51.474607 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-25bxf" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.707128 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-n2s9j"] Dec 03 08:01:52 crc kubenswrapper[4831]: E1203 08:01:52.707976 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e47a0a9-fa16-47ce-aecb-2e300ac07ea2" containerName="mariadb-database-create" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.707998 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e47a0a9-fa16-47ce-aecb-2e300ac07ea2" containerName="mariadb-database-create" Dec 03 08:01:52 crc kubenswrapper[4831]: E1203 08:01:52.708037 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27769491-0a4a-41cd-869a-83484c81873c" containerName="mariadb-account-create-update" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.708049 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="27769491-0a4a-41cd-869a-83484c81873c" containerName="mariadb-account-create-update" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.708370 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="27769491-0a4a-41cd-869a-83484c81873c" 
containerName="mariadb-account-create-update" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.708396 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e47a0a9-fa16-47ce-aecb-2e300ac07ea2" containerName="mariadb-database-create" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.709374 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.712175 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.712605 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qrh9w" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.719379 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-n2s9j"] Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.887871 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgs5z\" (UniqueName: \"kubernetes.io/projected/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-kube-api-access-bgs5z\") pod \"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.888116 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-combined-ca-bundle\") pod \"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.888391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-db-sync-config-data\") pod \"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.990649 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-combined-ca-bundle\") pod \"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.990771 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-db-sync-config-data\") pod \"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.990874 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgs5z\" (UniqueName: \"kubernetes.io/projected/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-kube-api-access-bgs5z\") pod \"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.997481 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-db-sync-config-data\") pod \"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:52 crc kubenswrapper[4831]: I1203 08:01:52.997713 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-combined-ca-bundle\") pod 
\"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:53 crc kubenswrapper[4831]: I1203 08:01:53.015574 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgs5z\" (UniqueName: \"kubernetes.io/projected/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-kube-api-access-bgs5z\") pod \"barbican-db-sync-n2s9j\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:53 crc kubenswrapper[4831]: I1203 08:01:53.036162 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:53 crc kubenswrapper[4831]: I1203 08:01:53.318345 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-n2s9j"] Dec 03 08:01:53 crc kubenswrapper[4831]: I1203 08:01:53.498930 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2s9j" event={"ID":"be0b20d0-fd57-4f34-a011-7b50b9fb7af9","Type":"ContainerStarted","Data":"f56992e1020cd7a32df9df166945828f6e43da2f8edc0694124ec93a470ac27f"} Dec 03 08:01:54 crc kubenswrapper[4831]: I1203 08:01:54.507821 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2s9j" event={"ID":"be0b20d0-fd57-4f34-a011-7b50b9fb7af9","Type":"ContainerStarted","Data":"0de074cd3685fe1e0ac59b03b11acc0dc7d15626ae841da1acc1f73815b2f917"} Dec 03 08:01:54 crc kubenswrapper[4831]: I1203 08:01:54.534734 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-n2s9j" podStartSLOduration=2.534714357 podStartE2EDuration="2.534714357s" podCreationTimestamp="2025-12-03 08:01:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:01:54.53287843 +0000 UTC m=+5451.876461938" watchObservedRunningTime="2025-12-03 08:01:54.534714357 +0000 UTC 
m=+5451.878297885" Dec 03 08:01:55 crc kubenswrapper[4831]: I1203 08:01:55.527437 4831 generic.go:334] "Generic (PLEG): container finished" podID="be0b20d0-fd57-4f34-a011-7b50b9fb7af9" containerID="0de074cd3685fe1e0ac59b03b11acc0dc7d15626ae841da1acc1f73815b2f917" exitCode=0 Dec 03 08:01:55 crc kubenswrapper[4831]: I1203 08:01:55.527585 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2s9j" event={"ID":"be0b20d0-fd57-4f34-a011-7b50b9fb7af9","Type":"ContainerDied","Data":"0de074cd3685fe1e0ac59b03b11acc0dc7d15626ae841da1acc1f73815b2f917"} Dec 03 08:01:56 crc kubenswrapper[4831]: I1203 08:01:56.963261 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:56 crc kubenswrapper[4831]: I1203 08:01:56.976414 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgs5z\" (UniqueName: \"kubernetes.io/projected/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-kube-api-access-bgs5z\") pod \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " Dec 03 08:01:56 crc kubenswrapper[4831]: I1203 08:01:56.976610 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-db-sync-config-data\") pod \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " Dec 03 08:01:56 crc kubenswrapper[4831]: I1203 08:01:56.976745 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-combined-ca-bundle\") pod \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\" (UID: \"be0b20d0-fd57-4f34-a011-7b50b9fb7af9\") " Dec 03 08:01:56 crc kubenswrapper[4831]: I1203 08:01:56.987385 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-kube-api-access-bgs5z" (OuterVolumeSpecName: "kube-api-access-bgs5z") pod "be0b20d0-fd57-4f34-a011-7b50b9fb7af9" (UID: "be0b20d0-fd57-4f34-a011-7b50b9fb7af9"). InnerVolumeSpecName "kube-api-access-bgs5z". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:01:56 crc kubenswrapper[4831]: I1203 08:01:56.988253 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "be0b20d0-fd57-4f34-a011-7b50b9fb7af9" (UID: "be0b20d0-fd57-4f34-a011-7b50b9fb7af9"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.012464 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "be0b20d0-fd57-4f34-a011-7b50b9fb7af9" (UID: "be0b20d0-fd57-4f34-a011-7b50b9fb7af9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.078568 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgs5z\" (UniqueName: \"kubernetes.io/projected/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-kube-api-access-bgs5z\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.078606 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.078615 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/be0b20d0-fd57-4f34-a011-7b50b9fb7af9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.551550 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-n2s9j" event={"ID":"be0b20d0-fd57-4f34-a011-7b50b9fb7af9","Type":"ContainerDied","Data":"f56992e1020cd7a32df9df166945828f6e43da2f8edc0694124ec93a470ac27f"} Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.551600 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f56992e1020cd7a32df9df166945828f6e43da2f8edc0694124ec93a470ac27f" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.551637 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-n2s9j" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.597113 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.597200 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.843954 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-f65b6d578-mkb48"] Dec 03 08:01:57 crc kubenswrapper[4831]: E1203 08:01:57.844697 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be0b20d0-fd57-4f34-a011-7b50b9fb7af9" containerName="barbican-db-sync" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.850205 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="be0b20d0-fd57-4f34-a011-7b50b9fb7af9" containerName="barbican-db-sync" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.850676 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="be0b20d0-fd57-4f34-a011-7b50b9fb7af9" containerName="barbican-db-sync" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.851915 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.858035 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f65b6d578-mkb48"] Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.859255 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-qrh9w" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.859459 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.859637 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.901206 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-758d7559-cbnxt"] Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.907712 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.914635 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.915899 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-758d7559-cbnxt"] Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.946977 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-74b69dd9c7-k8bw7"] Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.950193 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.960705 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74b69dd9c7-k8bw7"] Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.997924 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-config-data\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.998999 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f30a154-45b6-41f5-8aad-4019e18f01b6-logs\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.999163 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-config-data-custom\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:57 crc kubenswrapper[4831]: I1203 08:01:57.999456 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqg2k\" (UniqueName: \"kubernetes.io/projected/3f30a154-45b6-41f5-8aad-4019e18f01b6-kube-api-access-xqg2k\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:57 crc 
kubenswrapper[4831]: I1203 08:01:57.999569 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-combined-ca-bundle\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.001195 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d665c4464-jxss9"] Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.002662 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.014433 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.019977 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d665c4464-jxss9"] Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.100528 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1877ded-6e84-4ca9-b911-3e2996993bdb-logs\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.100579 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bh5k\" (UniqueName: \"kubernetes.io/projected/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-kube-api-access-2bh5k\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.100611 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-config-data\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.100674 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-config-data-custom\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.100778 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xqg2k\" (UniqueName: \"kubernetes.io/projected/3f30a154-45b6-41f5-8aad-4019e18f01b6-kube-api-access-xqg2k\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.100839 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-combined-ca-bundle\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.100926 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-dns-svc\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" 
Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.100965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tfdv\" (UniqueName: \"kubernetes.io/projected/882ddda9-c85e-4b93-afaf-b34b080d7047-kube-api-access-8tfdv\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101031 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-config-data\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101133 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-config-data\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101163 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882ddda9-c85e-4b93-afaf-b34b080d7047-logs\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101199 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-combined-ca-bundle\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " 
pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101220 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvlz7\" (UniqueName: \"kubernetes.io/projected/e1877ded-6e84-4ca9-b911-3e2996993bdb-kube-api-access-nvlz7\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101260 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f30a154-45b6-41f5-8aad-4019e18f01b6-logs\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101300 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-config-data-custom\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101350 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-combined-ca-bundle\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101371 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-sb\") pod 
\"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.101388 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-config\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.102114 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3f30a154-45b6-41f5-8aad-4019e18f01b6-logs\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.102162 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-config-data-custom\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.102185 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-nb\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.106772 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-config-data-custom\") pod 
\"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.107436 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-config-data\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.109079 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f30a154-45b6-41f5-8aad-4019e18f01b6-combined-ca-bundle\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.119654 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqg2k\" (UniqueName: \"kubernetes.io/projected/3f30a154-45b6-41f5-8aad-4019e18f01b6-kube-api-access-xqg2k\") pod \"barbican-keystone-listener-f65b6d578-mkb48\" (UID: \"3f30a154-45b6-41f5-8aad-4019e18f01b6\") " pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.173418 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.207984 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-config-data-custom\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208042 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-dns-svc\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208066 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tfdv\" (UniqueName: \"kubernetes.io/projected/882ddda9-c85e-4b93-afaf-b34b080d7047-kube-api-access-8tfdv\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208088 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-config-data\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208117 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882ddda9-c85e-4b93-afaf-b34b080d7047-logs\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " 
pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-combined-ca-bundle\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvlz7\" (UniqueName: \"kubernetes.io/projected/e1877ded-6e84-4ca9-b911-3e2996993bdb-kube-api-access-nvlz7\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208190 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-combined-ca-bundle\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208205 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-sb\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208219 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-config\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " 
pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208234 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-config-data-custom\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208251 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-nb\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208274 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1877ded-6e84-4ca9-b911-3e2996993bdb-logs\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bh5k\" (UniqueName: \"kubernetes.io/projected/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-kube-api-access-2bh5k\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.208332 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-config-data\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 
08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.209390 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e1877ded-6e84-4ca9-b911-3e2996993bdb-logs\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.210263 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-sb\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.210552 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-config\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.210828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/882ddda9-c85e-4b93-afaf-b34b080d7047-logs\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.211375 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-dns-svc\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.211558 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-nb\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.222050 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-combined-ca-bundle\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.223815 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-combined-ca-bundle\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.235978 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-config-data-custom\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.236016 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-config-data-custom\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.236861 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e1877ded-6e84-4ca9-b911-3e2996993bdb-config-data\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.237149 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/882ddda9-c85e-4b93-afaf-b34b080d7047-config-data\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.262256 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvlz7\" (UniqueName: \"kubernetes.io/projected/e1877ded-6e84-4ca9-b911-3e2996993bdb-kube-api-access-nvlz7\") pod \"barbican-worker-758d7559-cbnxt\" (UID: \"e1877ded-6e84-4ca9-b911-3e2996993bdb\") " pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.265794 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tfdv\" (UniqueName: \"kubernetes.io/projected/882ddda9-c85e-4b93-afaf-b34b080d7047-kube-api-access-8tfdv\") pod \"barbican-api-5d665c4464-jxss9\" (UID: \"882ddda9-c85e-4b93-afaf-b34b080d7047\") " pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.269087 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bh5k\" (UniqueName: \"kubernetes.io/projected/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-kube-api-access-2bh5k\") pod \"dnsmasq-dns-74b69dd9c7-k8bw7\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.338719 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.532455 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-758d7559-cbnxt" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.565782 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.839804 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-f65b6d578-mkb48"] Dec 03 08:01:58 crc kubenswrapper[4831]: I1203 08:01:58.888201 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d665c4464-jxss9"] Dec 03 08:01:58 crc kubenswrapper[4831]: W1203 08:01:58.898064 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod882ddda9_c85e_4b93_afaf_b34b080d7047.slice/crio-06cb900db9588157baa90f0eb2353e9788d6658462590d19461a1a6753dce713 WatchSource:0}: Error finding container 06cb900db9588157baa90f0eb2353e9788d6658462590d19461a1a6753dce713: Status 404 returned error can't find the container with id 06cb900db9588157baa90f0eb2353e9788d6658462590d19461a1a6753dce713 Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.002493 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-758d7559-cbnxt"] Dec 03 08:01:59 crc kubenswrapper[4831]: W1203 08:01:59.005655 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode1877ded_6e84_4ca9_b911_3e2996993bdb.slice/crio-429437703e9ae2529571275bdcec7b366b9d1e2067a029ef0ab0fe54fed6a13e WatchSource:0}: Error finding container 429437703e9ae2529571275bdcec7b366b9d1e2067a029ef0ab0fe54fed6a13e: Status 404 returned error can't find the container with id 
429437703e9ae2529571275bdcec7b366b9d1e2067a029ef0ab0fe54fed6a13e Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.057654 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-74b69dd9c7-k8bw7"] Dec 03 08:01:59 crc kubenswrapper[4831]: W1203 08:01:59.059991 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbe5b6fe2_0642_45a1_b2ae_943f4ac7f553.slice/crio-9ad4801def2edbe702ca49aa03641db6186a2aeddc32e8bcfb107785db5c3c4b WatchSource:0}: Error finding container 9ad4801def2edbe702ca49aa03641db6186a2aeddc32e8bcfb107785db5c3c4b: Status 404 returned error can't find the container with id 9ad4801def2edbe702ca49aa03641db6186a2aeddc32e8bcfb107785db5c3c4b Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.567857 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" event={"ID":"3f30a154-45b6-41f5-8aad-4019e18f01b6","Type":"ContainerStarted","Data":"f753cfa8d8eef85b9ba2aac387a90aed817bd50c8de1609583cde03743cbeb4e"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.568199 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" event={"ID":"3f30a154-45b6-41f5-8aad-4019e18f01b6","Type":"ContainerStarted","Data":"001e9a91d864b9012f49e5666b7964ed650835227b1fa686d26c095f92ffb144"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.568216 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" event={"ID":"3f30a154-45b6-41f5-8aad-4019e18f01b6","Type":"ContainerStarted","Data":"00c724083142e60412a954f0111656335d729dc59445d3c5b022e8ff7cfd59d3"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.569457 4831 generic.go:334] "Generic (PLEG): container finished" podID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" 
containerID="091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b" exitCode=0 Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.569574 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" event={"ID":"be5b6fe2-0642-45a1-b2ae-943f4ac7f553","Type":"ContainerDied","Data":"091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.569611 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" event={"ID":"be5b6fe2-0642-45a1-b2ae-943f4ac7f553","Type":"ContainerStarted","Data":"9ad4801def2edbe702ca49aa03641db6186a2aeddc32e8bcfb107785db5c3c4b"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.571732 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-758d7559-cbnxt" event={"ID":"e1877ded-6e84-4ca9-b911-3e2996993bdb","Type":"ContainerStarted","Data":"e85c2c0b8755da7f8e290a197efc83fb9bb57a58b33182c112386708b391b96b"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.571768 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-758d7559-cbnxt" event={"ID":"e1877ded-6e84-4ca9-b911-3e2996993bdb","Type":"ContainerStarted","Data":"285aeeb77ff7bd2a0bc93e2394fead15eb9c766f7b297d498585bdd6a2d0376b"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.571783 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-758d7559-cbnxt" event={"ID":"e1877ded-6e84-4ca9-b911-3e2996993bdb","Type":"ContainerStarted","Data":"429437703e9ae2529571275bdcec7b366b9d1e2067a029ef0ab0fe54fed6a13e"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.574099 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d665c4464-jxss9" event={"ID":"882ddda9-c85e-4b93-afaf-b34b080d7047","Type":"ContainerStarted","Data":"61deeffeb0c933949cbc1bccb87d1d66f3c8cc984056a83f1bf94fb0c0df7eee"} Dec 03 08:01:59 
crc kubenswrapper[4831]: I1203 08:01:59.574151 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d665c4464-jxss9" event={"ID":"882ddda9-c85e-4b93-afaf-b34b080d7047","Type":"ContainerStarted","Data":"3f21aa01960e75a410f1f5771c2097bd4bc616fe3cc367d9dbe158f5c85202f1"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.574165 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d665c4464-jxss9" event={"ID":"882ddda9-c85e-4b93-afaf-b34b080d7047","Type":"ContainerStarted","Data":"06cb900db9588157baa90f0eb2353e9788d6658462590d19461a1a6753dce713"} Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.574874 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.574908 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.614986 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-f65b6d578-mkb48" podStartSLOduration=2.614964874 podStartE2EDuration="2.614964874s" podCreationTimestamp="2025-12-03 08:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:01:59.588759349 +0000 UTC m=+5456.932342857" watchObservedRunningTime="2025-12-03 08:01:59.614964874 +0000 UTC m=+5456.958548382" Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.649585 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d665c4464-jxss9" podStartSLOduration=2.649562641 podStartE2EDuration="2.649562641s" podCreationTimestamp="2025-12-03 08:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
08:01:59.611686023 +0000 UTC m=+5456.955269531" watchObservedRunningTime="2025-12-03 08:01:59.649562641 +0000 UTC m=+5456.993146149" Dec 03 08:01:59 crc kubenswrapper[4831]: I1203 08:01:59.669911 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-758d7559-cbnxt" podStartSLOduration=2.669890074 podStartE2EDuration="2.669890074s" podCreationTimestamp="2025-12-03 08:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:01:59.642610965 +0000 UTC m=+5456.986194473" watchObservedRunningTime="2025-12-03 08:01:59.669890074 +0000 UTC m=+5457.013473582" Dec 03 08:02:00 crc kubenswrapper[4831]: I1203 08:02:00.583609 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" event={"ID":"be5b6fe2-0642-45a1-b2ae-943f4ac7f553","Type":"ContainerStarted","Data":"48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a"} Dec 03 08:02:00 crc kubenswrapper[4831]: I1203 08:02:00.585291 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:02:00 crc kubenswrapper[4831]: I1203 08:02:00.620104 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" podStartSLOduration=3.620085163 podStartE2EDuration="3.620085163s" podCreationTimestamp="2025-12-03 08:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:02:00.610818906 +0000 UTC m=+5457.954402434" watchObservedRunningTime="2025-12-03 08:02:00.620085163 +0000 UTC m=+5457.963668681" Dec 03 08:02:08 crc kubenswrapper[4831]: I1203 08:02:08.567511 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:02:08 crc kubenswrapper[4831]: I1203 
08:02:08.652339 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778c4f6d7f-bvnv9"] Dec 03 08:02:08 crc kubenswrapper[4831]: I1203 08:02:08.652696 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" podUID="4c7fb030-01a4-4122-833f-c306bea2f68a" containerName="dnsmasq-dns" containerID="cri-o://c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b" gracePeriod=10 Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.117046 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.155066 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-sb\") pod \"4c7fb030-01a4-4122-833f-c306bea2f68a\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.155126 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-nb\") pod \"4c7fb030-01a4-4122-833f-c306bea2f68a\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.155206 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-config\") pod \"4c7fb030-01a4-4122-833f-c306bea2f68a\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.155249 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48nsl\" (UniqueName: \"kubernetes.io/projected/4c7fb030-01a4-4122-833f-c306bea2f68a-kube-api-access-48nsl\") pod 
\"4c7fb030-01a4-4122-833f-c306bea2f68a\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.155380 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-dns-svc\") pod \"4c7fb030-01a4-4122-833f-c306bea2f68a\" (UID: \"4c7fb030-01a4-4122-833f-c306bea2f68a\") " Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.170764 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c7fb030-01a4-4122-833f-c306bea2f68a-kube-api-access-48nsl" (OuterVolumeSpecName: "kube-api-access-48nsl") pod "4c7fb030-01a4-4122-833f-c306bea2f68a" (UID: "4c7fb030-01a4-4122-833f-c306bea2f68a"). InnerVolumeSpecName "kube-api-access-48nsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.199698 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4c7fb030-01a4-4122-833f-c306bea2f68a" (UID: "4c7fb030-01a4-4122-833f-c306bea2f68a"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.206016 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4c7fb030-01a4-4122-833f-c306bea2f68a" (UID: "4c7fb030-01a4-4122-833f-c306bea2f68a"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.208233 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-config" (OuterVolumeSpecName: "config") pod "4c7fb030-01a4-4122-833f-c306bea2f68a" (UID: "4c7fb030-01a4-4122-833f-c306bea2f68a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.209593 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4c7fb030-01a4-4122-833f-c306bea2f68a" (UID: "4c7fb030-01a4-4122-833f-c306bea2f68a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.257439 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.257475 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.257486 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.257494 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c7fb030-01a4-4122-833f-c306bea2f68a-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.257512 
4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48nsl\" (UniqueName: \"kubernetes.io/projected/4c7fb030-01a4-4122-833f-c306bea2f68a-kube-api-access-48nsl\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.688358 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.688292 4831 generic.go:334] "Generic (PLEG): container finished" podID="4c7fb030-01a4-4122-833f-c306bea2f68a" containerID="c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b" exitCode=0 Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.688380 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" event={"ID":"4c7fb030-01a4-4122-833f-c306bea2f68a","Type":"ContainerDied","Data":"c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b"} Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.690010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-778c4f6d7f-bvnv9" event={"ID":"4c7fb030-01a4-4122-833f-c306bea2f68a","Type":"ContainerDied","Data":"c9a3fa8387a41eb574c663ca1963957cb446713475967ba06f051715e8104931"} Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.690070 4831 scope.go:117] "RemoveContainer" containerID="c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.735610 4831 scope.go:117] "RemoveContainer" containerID="55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.738004 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-778c4f6d7f-bvnv9"] Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.747433 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-778c4f6d7f-bvnv9"] Dec 03 08:02:09 crc 
kubenswrapper[4831]: I1203 08:02:09.753826 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.768007 4831 scope.go:117] "RemoveContainer" containerID="c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b" Dec 03 08:02:09 crc kubenswrapper[4831]: E1203 08:02:09.772352 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b\": container with ID starting with c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b not found: ID does not exist" containerID="c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.772625 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b"} err="failed to get container status \"c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b\": rpc error: code = NotFound desc = could not find container \"c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b\": container with ID starting with c88b9207228d72b8506900533ba62271325085ba307281123c30f329bc06aa8b not found: ID does not exist" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.772683 4831 scope.go:117] "RemoveContainer" containerID="55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55" Dec 03 08:02:09 crc kubenswrapper[4831]: E1203 08:02:09.773215 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55\": container with ID starting with 55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55 not found: ID does not exist" 
containerID="55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.773287 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55"} err="failed to get container status \"55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55\": rpc error: code = NotFound desc = could not find container \"55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55\": container with ID starting with 55053fad3553271c219c4b1de24491f1ce045ad258007169fa5ef7c4fd030c55 not found: ID does not exist" Dec 03 08:02:09 crc kubenswrapper[4831]: I1203 08:02:09.810107 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d665c4464-jxss9" Dec 03 08:02:11 crc kubenswrapper[4831]: I1203 08:02:11.029577 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c7fb030-01a4-4122-833f-c306bea2f68a" path="/var/lib/kubelet/pods/4c7fb030-01a4-4122-833f-c306bea2f68a/volumes" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.870308 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-eb20-account-create-update-pv52t"] Dec 03 08:02:21 crc kubenswrapper[4831]: E1203 08:02:21.871439 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7fb030-01a4-4122-833f-c306bea2f68a" containerName="init" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.871462 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c7fb030-01a4-4122-833f-c306bea2f68a" containerName="init" Dec 03 08:02:21 crc kubenswrapper[4831]: E1203 08:02:21.871496 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c7fb030-01a4-4122-833f-c306bea2f68a" containerName="dnsmasq-dns" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.871507 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="4c7fb030-01a4-4122-833f-c306bea2f68a" containerName="dnsmasq-dns" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.871791 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c7fb030-01a4-4122-833f-c306bea2f68a" containerName="dnsmasq-dns" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.872704 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.874952 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.881627 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-h98qc"] Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.882900 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.891493 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-h98qc"] Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.892037 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fl94\" (UniqueName: \"kubernetes.io/projected/8d8aef98-04c2-4da9-b3cc-da5f44144eea-kube-api-access-7fl94\") pod \"neutron-db-create-h98qc\" (UID: \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\") " pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.892087 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnklk\" (UniqueName: \"kubernetes.io/projected/edd22683-04e1-40b1-b6b5-31caa4789313-kube-api-access-pnklk\") pod \"neutron-eb20-account-create-update-pv52t\" (UID: \"edd22683-04e1-40b1-b6b5-31caa4789313\") " pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:21 
crc kubenswrapper[4831]: I1203 08:02:21.892113 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edd22683-04e1-40b1-b6b5-31caa4789313-operator-scripts\") pod \"neutron-eb20-account-create-update-pv52t\" (UID: \"edd22683-04e1-40b1-b6b5-31caa4789313\") " pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.892158 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8aef98-04c2-4da9-b3cc-da5f44144eea-operator-scripts\") pod \"neutron-db-create-h98qc\" (UID: \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\") " pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.900551 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-eb20-account-create-update-pv52t"] Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.993294 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8aef98-04c2-4da9-b3cc-da5f44144eea-operator-scripts\") pod \"neutron-db-create-h98qc\" (UID: \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\") " pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.993445 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fl94\" (UniqueName: \"kubernetes.io/projected/8d8aef98-04c2-4da9-b3cc-da5f44144eea-kube-api-access-7fl94\") pod \"neutron-db-create-h98qc\" (UID: \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\") " pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.993467 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pnklk\" (UniqueName: 
\"kubernetes.io/projected/edd22683-04e1-40b1-b6b5-31caa4789313-kube-api-access-pnklk\") pod \"neutron-eb20-account-create-update-pv52t\" (UID: \"edd22683-04e1-40b1-b6b5-31caa4789313\") " pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.993486 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edd22683-04e1-40b1-b6b5-31caa4789313-operator-scripts\") pod \"neutron-eb20-account-create-update-pv52t\" (UID: \"edd22683-04e1-40b1-b6b5-31caa4789313\") " pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.994126 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edd22683-04e1-40b1-b6b5-31caa4789313-operator-scripts\") pod \"neutron-eb20-account-create-update-pv52t\" (UID: \"edd22683-04e1-40b1-b6b5-31caa4789313\") " pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:21 crc kubenswrapper[4831]: I1203 08:02:21.994133 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8aef98-04c2-4da9-b3cc-da5f44144eea-operator-scripts\") pod \"neutron-db-create-h98qc\" (UID: \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\") " pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:22 crc kubenswrapper[4831]: I1203 08:02:22.021246 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fl94\" (UniqueName: \"kubernetes.io/projected/8d8aef98-04c2-4da9-b3cc-da5f44144eea-kube-api-access-7fl94\") pod \"neutron-db-create-h98qc\" (UID: \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\") " pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:22 crc kubenswrapper[4831]: I1203 08:02:22.027844 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnklk\" 
(UniqueName: \"kubernetes.io/projected/edd22683-04e1-40b1-b6b5-31caa4789313-kube-api-access-pnklk\") pod \"neutron-eb20-account-create-update-pv52t\" (UID: \"edd22683-04e1-40b1-b6b5-31caa4789313\") " pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:22 crc kubenswrapper[4831]: I1203 08:02:22.206553 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:22 crc kubenswrapper[4831]: I1203 08:02:22.222009 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:22 crc kubenswrapper[4831]: W1203 08:02:22.702040 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podedd22683_04e1_40b1_b6b5_31caa4789313.slice/crio-2a0e94bab7044ca7d5d4877c89b6932a5fe21c3967c72c5dbcb0ef0c573f401f WatchSource:0}: Error finding container 2a0e94bab7044ca7d5d4877c89b6932a5fe21c3967c72c5dbcb0ef0c573f401f: Status 404 returned error can't find the container with id 2a0e94bab7044ca7d5d4877c89b6932a5fe21c3967c72c5dbcb0ef0c573f401f Dec 03 08:02:22 crc kubenswrapper[4831]: I1203 08:02:22.709089 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-eb20-account-create-update-pv52t"] Dec 03 08:02:22 crc kubenswrapper[4831]: W1203 08:02:22.799560 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8d8aef98_04c2_4da9_b3cc_da5f44144eea.slice/crio-554cc351f06062deefd47c742f22bd21b2a28a8a0c1e1e81b6843ee902f306ea WatchSource:0}: Error finding container 554cc351f06062deefd47c742f22bd21b2a28a8a0c1e1e81b6843ee902f306ea: Status 404 returned error can't find the container with id 554cc351f06062deefd47c742f22bd21b2a28a8a0c1e1e81b6843ee902f306ea Dec 03 08:02:22 crc kubenswrapper[4831]: I1203 08:02:22.807168 4831 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/neutron-db-create-h98qc"] Dec 03 08:02:22 crc kubenswrapper[4831]: I1203 08:02:22.831941 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h98qc" event={"ID":"8d8aef98-04c2-4da9-b3cc-da5f44144eea","Type":"ContainerStarted","Data":"554cc351f06062deefd47c742f22bd21b2a28a8a0c1e1e81b6843ee902f306ea"} Dec 03 08:02:22 crc kubenswrapper[4831]: I1203 08:02:22.833270 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-eb20-account-create-update-pv52t" event={"ID":"edd22683-04e1-40b1-b6b5-31caa4789313","Type":"ContainerStarted","Data":"2a0e94bab7044ca7d5d4877c89b6932a5fe21c3967c72c5dbcb0ef0c573f401f"} Dec 03 08:02:23 crc kubenswrapper[4831]: I1203 08:02:23.847873 4831 generic.go:334] "Generic (PLEG): container finished" podID="edd22683-04e1-40b1-b6b5-31caa4789313" containerID="2bf8b40b5f8181eacfe88d3b130caa1eff0314b3f85e7e5b003597acb98746de" exitCode=0 Dec 03 08:02:23 crc kubenswrapper[4831]: I1203 08:02:23.847979 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-eb20-account-create-update-pv52t" event={"ID":"edd22683-04e1-40b1-b6b5-31caa4789313","Type":"ContainerDied","Data":"2bf8b40b5f8181eacfe88d3b130caa1eff0314b3f85e7e5b003597acb98746de"} Dec 03 08:02:23 crc kubenswrapper[4831]: I1203 08:02:23.852466 4831 generic.go:334] "Generic (PLEG): container finished" podID="8d8aef98-04c2-4da9-b3cc-da5f44144eea" containerID="19ea6c2b93f47bab01399b4628a20b8a07fec2607fb8433e13f36493c877aba4" exitCode=0 Dec 03 08:02:23 crc kubenswrapper[4831]: I1203 08:02:23.852538 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h98qc" event={"ID":"8d8aef98-04c2-4da9-b3cc-da5f44144eea","Type":"ContainerDied","Data":"19ea6c2b93f47bab01399b4628a20b8a07fec2607fb8433e13f36493c877aba4"} Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.309703 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.333548 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.354008 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pnklk\" (UniqueName: \"kubernetes.io/projected/edd22683-04e1-40b1-b6b5-31caa4789313-kube-api-access-pnklk\") pod \"edd22683-04e1-40b1-b6b5-31caa4789313\" (UID: \"edd22683-04e1-40b1-b6b5-31caa4789313\") " Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.354061 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fl94\" (UniqueName: \"kubernetes.io/projected/8d8aef98-04c2-4da9-b3cc-da5f44144eea-kube-api-access-7fl94\") pod \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\" (UID: \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\") " Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.354132 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8aef98-04c2-4da9-b3cc-da5f44144eea-operator-scripts\") pod \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\" (UID: \"8d8aef98-04c2-4da9-b3cc-da5f44144eea\") " Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.354187 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edd22683-04e1-40b1-b6b5-31caa4789313-operator-scripts\") pod \"edd22683-04e1-40b1-b6b5-31caa4789313\" (UID: \"edd22683-04e1-40b1-b6b5-31caa4789313\") " Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.355288 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d8aef98-04c2-4da9-b3cc-da5f44144eea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"8d8aef98-04c2-4da9-b3cc-da5f44144eea" (UID: "8d8aef98-04c2-4da9-b3cc-da5f44144eea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.355509 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/edd22683-04e1-40b1-b6b5-31caa4789313-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "edd22683-04e1-40b1-b6b5-31caa4789313" (UID: "edd22683-04e1-40b1-b6b5-31caa4789313"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.360563 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd22683-04e1-40b1-b6b5-31caa4789313-kube-api-access-pnklk" (OuterVolumeSpecName: "kube-api-access-pnklk") pod "edd22683-04e1-40b1-b6b5-31caa4789313" (UID: "edd22683-04e1-40b1-b6b5-31caa4789313"). InnerVolumeSpecName "kube-api-access-pnklk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.360615 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d8aef98-04c2-4da9-b3cc-da5f44144eea-kube-api-access-7fl94" (OuterVolumeSpecName: "kube-api-access-7fl94") pod "8d8aef98-04c2-4da9-b3cc-da5f44144eea" (UID: "8d8aef98-04c2-4da9-b3cc-da5f44144eea"). InnerVolumeSpecName "kube-api-access-7fl94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.455839 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pnklk\" (UniqueName: \"kubernetes.io/projected/edd22683-04e1-40b1-b6b5-31caa4789313-kube-api-access-pnklk\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.456148 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fl94\" (UniqueName: \"kubernetes.io/projected/8d8aef98-04c2-4da9-b3cc-da5f44144eea-kube-api-access-7fl94\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.456166 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8d8aef98-04c2-4da9-b3cc-da5f44144eea-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.456174 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/edd22683-04e1-40b1-b6b5-31caa4789313-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.872412 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-eb20-account-create-update-pv52t" event={"ID":"edd22683-04e1-40b1-b6b5-31caa4789313","Type":"ContainerDied","Data":"2a0e94bab7044ca7d5d4877c89b6932a5fe21c3967c72c5dbcb0ef0c573f401f"} Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.872440 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-eb20-account-create-update-pv52t" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.872460 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a0e94bab7044ca7d5d4877c89b6932a5fe21c3967c72c5dbcb0ef0c573f401f" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.873781 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h98qc" event={"ID":"8d8aef98-04c2-4da9-b3cc-da5f44144eea","Type":"ContainerDied","Data":"554cc351f06062deefd47c742f22bd21b2a28a8a0c1e1e81b6843ee902f306ea"} Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.873825 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="554cc351f06062deefd47c742f22bd21b2a28a8a0c1e1e81b6843ee902f306ea" Dec 03 08:02:25 crc kubenswrapper[4831]: I1203 08:02:25.873900 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h98qc" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.097506 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-kg6v2"] Dec 03 08:02:27 crc kubenswrapper[4831]: E1203 08:02:27.098160 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="edd22683-04e1-40b1-b6b5-31caa4789313" containerName="mariadb-account-create-update" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.098178 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd22683-04e1-40b1-b6b5-31caa4789313" containerName="mariadb-account-create-update" Dec 03 08:02:27 crc kubenswrapper[4831]: E1203 08:02:27.098227 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d8aef98-04c2-4da9-b3cc-da5f44144eea" containerName="mariadb-database-create" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.098238 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d8aef98-04c2-4da9-b3cc-da5f44144eea" 
containerName="mariadb-database-create" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.098428 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d8aef98-04c2-4da9-b3cc-da5f44144eea" containerName="mariadb-database-create" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.098446 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="edd22683-04e1-40b1-b6b5-31caa4789313" containerName="mariadb-account-create-update" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.099077 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.101306 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.101359 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.101432 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v2j98" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.107930 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kg6v2"] Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.287078 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-config\") pod \"neutron-db-sync-kg6v2\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.287398 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-combined-ca-bundle\") pod \"neutron-db-sync-kg6v2\" (UID: 
\"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.287785 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jdc9\" (UniqueName: \"kubernetes.io/projected/f4836bb4-0fd6-403e-ad70-27483213715e-kube-api-access-2jdc9\") pod \"neutron-db-sync-kg6v2\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.389613 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-config\") pod \"neutron-db-sync-kg6v2\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.389676 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-combined-ca-bundle\") pod \"neutron-db-sync-kg6v2\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.389762 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jdc9\" (UniqueName: \"kubernetes.io/projected/f4836bb4-0fd6-403e-ad70-27483213715e-kube-api-access-2jdc9\") pod \"neutron-db-sync-kg6v2\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.395623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-config\") pod \"neutron-db-sync-kg6v2\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc 
kubenswrapper[4831]: I1203 08:02:27.395656 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-combined-ca-bundle\") pod \"neutron-db-sync-kg6v2\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.409797 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jdc9\" (UniqueName: \"kubernetes.io/projected/f4836bb4-0fd6-403e-ad70-27483213715e-kube-api-access-2jdc9\") pod \"neutron-db-sync-kg6v2\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.420526 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.596770 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.597119 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.877226 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-kg6v2"] Dec 03 08:02:27 crc kubenswrapper[4831]: I1203 08:02:27.892808 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kg6v2" 
event={"ID":"f4836bb4-0fd6-403e-ad70-27483213715e","Type":"ContainerStarted","Data":"ad9e0b81a657c76ca1ff034ed62bd16ab5a0d5b345ca995fab0fd8bc0280ac1e"} Dec 03 08:02:28 crc kubenswrapper[4831]: I1203 08:02:28.906857 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kg6v2" event={"ID":"f4836bb4-0fd6-403e-ad70-27483213715e","Type":"ContainerStarted","Data":"f66b2dd44e26e8b57f1d4f6970c830c14562f4786bbdd54f9d77f8f672414a21"} Dec 03 08:02:28 crc kubenswrapper[4831]: I1203 08:02:28.934858 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-kg6v2" podStartSLOduration=1.9348311169999999 podStartE2EDuration="1.934831117s" podCreationTimestamp="2025-12-03 08:02:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:02:28.929891793 +0000 UTC m=+5486.273475351" watchObservedRunningTime="2025-12-03 08:02:28.934831117 +0000 UTC m=+5486.278414665" Dec 03 08:02:34 crc kubenswrapper[4831]: I1203 08:02:34.972216 4831 generic.go:334] "Generic (PLEG): container finished" podID="f4836bb4-0fd6-403e-ad70-27483213715e" containerID="f66b2dd44e26e8b57f1d4f6970c830c14562f4786bbdd54f9d77f8f672414a21" exitCode=0 Dec 03 08:02:34 crc kubenswrapper[4831]: I1203 08:02:34.972306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kg6v2" event={"ID":"f4836bb4-0fd6-403e-ad70-27483213715e","Type":"ContainerDied","Data":"f66b2dd44e26e8b57f1d4f6970c830c14562f4786bbdd54f9d77f8f672414a21"} Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.312259 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.464157 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-config\") pod \"f4836bb4-0fd6-403e-ad70-27483213715e\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.464577 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-combined-ca-bundle\") pod \"f4836bb4-0fd6-403e-ad70-27483213715e\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.464608 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jdc9\" (UniqueName: \"kubernetes.io/projected/f4836bb4-0fd6-403e-ad70-27483213715e-kube-api-access-2jdc9\") pod \"f4836bb4-0fd6-403e-ad70-27483213715e\" (UID: \"f4836bb4-0fd6-403e-ad70-27483213715e\") " Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.472815 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4836bb4-0fd6-403e-ad70-27483213715e-kube-api-access-2jdc9" (OuterVolumeSpecName: "kube-api-access-2jdc9") pod "f4836bb4-0fd6-403e-ad70-27483213715e" (UID: "f4836bb4-0fd6-403e-ad70-27483213715e"). InnerVolumeSpecName "kube-api-access-2jdc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.512541 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f4836bb4-0fd6-403e-ad70-27483213715e" (UID: "f4836bb4-0fd6-403e-ad70-27483213715e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.567152 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.567195 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jdc9\" (UniqueName: \"kubernetes.io/projected/f4836bb4-0fd6-403e-ad70-27483213715e-kube-api-access-2jdc9\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.568433 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-config" (OuterVolumeSpecName: "config") pod "f4836bb4-0fd6-403e-ad70-27483213715e" (UID: "f4836bb4-0fd6-403e-ad70-27483213715e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.669088 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f4836bb4-0fd6-403e-ad70-27483213715e-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.992963 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-kg6v2" event={"ID":"f4836bb4-0fd6-403e-ad70-27483213715e","Type":"ContainerDied","Data":"ad9e0b81a657c76ca1ff034ed62bd16ab5a0d5b345ca995fab0fd8bc0280ac1e"} Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.993008 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9e0b81a657c76ca1ff034ed62bd16ab5a0d5b345ca995fab0fd8bc0280ac1e" Dec 03 08:02:36 crc kubenswrapper[4831]: I1203 08:02:36.993094 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-kg6v2" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.152115 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd99b764c-m9sbd"] Dec 03 08:02:37 crc kubenswrapper[4831]: E1203 08:02:37.152573 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4836bb4-0fd6-403e-ad70-27483213715e" containerName="neutron-db-sync" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.152598 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4836bb4-0fd6-403e-ad70-27483213715e" containerName="neutron-db-sync" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.152834 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4836bb4-0fd6-403e-ad70-27483213715e" containerName="neutron-db-sync" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.153880 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.161736 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd99b764c-m9sbd"] Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.289733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-config\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.289904 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-nb\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc 
kubenswrapper[4831]: I1203 08:02:37.290036 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n74c4\" (UniqueName: \"kubernetes.io/projected/d951b914-3516-4abd-83af-6b7fcf91d390-kube-api-access-n74c4\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.290083 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-sb\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.290134 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-dns-svc\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.309233 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67c669c55-sljbg"] Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.310612 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.318848 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.319080 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-v2j98" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.319201 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.320980 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c669c55-sljbg"] Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391278 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-sb\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391407 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-dns-svc\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391459 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-config\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391505 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-nb\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391539 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-combined-ca-bundle\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391602 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmvch\" (UniqueName: \"kubernetes.io/projected/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-kube-api-access-fmvch\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391644 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-config\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391703 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n74c4\" (UniqueName: \"kubernetes.io/projected/d951b914-3516-4abd-83af-6b7fcf91d390-kube-api-access-n74c4\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.391724 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-httpd-config\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.392205 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-sb\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.392486 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-dns-svc\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.392934 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-config\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.393425 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-nb\") pod \"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.409253 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n74c4\" (UniqueName: \"kubernetes.io/projected/d951b914-3516-4abd-83af-6b7fcf91d390-kube-api-access-n74c4\") pod 
\"dnsmasq-dns-cd99b764c-m9sbd\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.476930 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.493538 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmvch\" (UniqueName: \"kubernetes.io/projected/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-kube-api-access-fmvch\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.493604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-config\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.493648 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-httpd-config\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.493737 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-combined-ca-bundle\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.499104 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-config\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.499160 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-combined-ca-bundle\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.513217 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-httpd-config\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.554080 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmvch\" (UniqueName: \"kubernetes.io/projected/f3f7b2f0-4404-4060-a04c-02da1d8f7c43-kube-api-access-fmvch\") pod \"neutron-67c669c55-sljbg\" (UID: \"f3f7b2f0-4404-4060-a04c-02da1d8f7c43\") " pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:37 crc kubenswrapper[4831]: I1203 08:02:37.642724 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:38 crc kubenswrapper[4831]: I1203 08:02:38.055555 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd99b764c-m9sbd"] Dec 03 08:02:38 crc kubenswrapper[4831]: I1203 08:02:38.252128 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67c669c55-sljbg"] Dec 03 08:02:38 crc kubenswrapper[4831]: W1203 08:02:38.259265 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3f7b2f0_4404_4060_a04c_02da1d8f7c43.slice/crio-f69f17af536153f435a9267dbe87147c615b3de77b1fb2a2dfc23c5d084aca65 WatchSource:0}: Error finding container f69f17af536153f435a9267dbe87147c615b3de77b1fb2a2dfc23c5d084aca65: Status 404 returned error can't find the container with id f69f17af536153f435a9267dbe87147c615b3de77b1fb2a2dfc23c5d084aca65 Dec 03 08:02:39 crc kubenswrapper[4831]: I1203 08:02:39.021265 4831 generic.go:334] "Generic (PLEG): container finished" podID="d951b914-3516-4abd-83af-6b7fcf91d390" containerID="63e9cfeb1a5cfb3fac1bcd97a46757cbcf6b9551dcd5ce0918a87fe130081593" exitCode=0 Dec 03 08:02:39 crc kubenswrapper[4831]: I1203 08:02:39.192114 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c669c55-sljbg" event={"ID":"f3f7b2f0-4404-4060-a04c-02da1d8f7c43","Type":"ContainerStarted","Data":"2f5f2aa0d732db14b4865ed064597d73e400e9684b2fe399e0ae1da5c1805731"} Dec 03 08:02:39 crc kubenswrapper[4831]: I1203 08:02:39.192305 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:02:39 crc kubenswrapper[4831]: I1203 08:02:39.196622 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c669c55-sljbg" event={"ID":"f3f7b2f0-4404-4060-a04c-02da1d8f7c43","Type":"ContainerStarted","Data":"f0931e5dfd979de5c1ade50b33b0e79a2d36cbcaeab2f1c45297151f767939fe"} Dec 03 08:02:39 crc 
kubenswrapper[4831]: I1203 08:02:39.196657 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67c669c55-sljbg" event={"ID":"f3f7b2f0-4404-4060-a04c-02da1d8f7c43","Type":"ContainerStarted","Data":"f69f17af536153f435a9267dbe87147c615b3de77b1fb2a2dfc23c5d084aca65"} Dec 03 08:02:39 crc kubenswrapper[4831]: I1203 08:02:39.196675 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" event={"ID":"d951b914-3516-4abd-83af-6b7fcf91d390","Type":"ContainerDied","Data":"63e9cfeb1a5cfb3fac1bcd97a46757cbcf6b9551dcd5ce0918a87fe130081593"} Dec 03 08:02:39 crc kubenswrapper[4831]: I1203 08:02:39.196691 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" event={"ID":"d951b914-3516-4abd-83af-6b7fcf91d390","Type":"ContainerStarted","Data":"5c1b728879e01edcc6a0fc48d90292bcf1669177a6fe4bba7e64fdf8500736ce"} Dec 03 08:02:39 crc kubenswrapper[4831]: I1203 08:02:39.236492 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67c669c55-sljbg" podStartSLOduration=2.236469314 podStartE2EDuration="2.236469314s" podCreationTimestamp="2025-12-03 08:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:02:39.086720914 +0000 UTC m=+5496.430304422" watchObservedRunningTime="2025-12-03 08:02:39.236469314 +0000 UTC m=+5496.580052822" Dec 03 08:02:40 crc kubenswrapper[4831]: I1203 08:02:40.033463 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" event={"ID":"d951b914-3516-4abd-83af-6b7fcf91d390","Type":"ContainerStarted","Data":"fa2a63d762e981110f8efa36740d4f2fe113ec8a3cb221e3825d026d8033a04e"} Dec 03 08:02:40 crc kubenswrapper[4831]: I1203 08:02:40.060391 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" 
podStartSLOduration=3.060367433 podStartE2EDuration="3.060367433s" podCreationTimestamp="2025-12-03 08:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:02:40.054491271 +0000 UTC m=+5497.398074789" watchObservedRunningTime="2025-12-03 08:02:40.060367433 +0000 UTC m=+5497.403950941" Dec 03 08:02:41 crc kubenswrapper[4831]: I1203 08:02:41.041137 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:47 crc kubenswrapper[4831]: I1203 08:02:47.479491 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:02:47 crc kubenswrapper[4831]: I1203 08:02:47.557697 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74b69dd9c7-k8bw7"] Dec 03 08:02:47 crc kubenswrapper[4831]: I1203 08:02:47.557939 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" podUID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" containerName="dnsmasq-dns" containerID="cri-o://48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a" gracePeriod=10 Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.031629 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.109444 4831 generic.go:334] "Generic (PLEG): container finished" podID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" containerID="48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a" exitCode=0 Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.109490 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" event={"ID":"be5b6fe2-0642-45a1-b2ae-943f4ac7f553","Type":"ContainerDied","Data":"48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a"} Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.109520 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" event={"ID":"be5b6fe2-0642-45a1-b2ae-943f4ac7f553","Type":"ContainerDied","Data":"9ad4801def2edbe702ca49aa03641db6186a2aeddc32e8bcfb107785db5c3c4b"} Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.109542 4831 scope.go:117] "RemoveContainer" containerID="48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.109549 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-74b69dd9c7-k8bw7" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.134702 4831 scope.go:117] "RemoveContainer" containerID="091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.173301 4831 scope.go:117] "RemoveContainer" containerID="48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a" Dec 03 08:02:48 crc kubenswrapper[4831]: E1203 08:02:48.180560 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a\": container with ID starting with 48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a not found: ID does not exist" containerID="48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.180620 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a"} err="failed to get container status \"48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a\": rpc error: code = NotFound desc = could not find container \"48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a\": container with ID starting with 48f9b69f6b09e57e772f32297bda2ffa0e76c14ea2fc7b9ca48655110b721e7a not found: ID does not exist" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.180654 4831 scope.go:117] "RemoveContainer" containerID="091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b" Dec 03 08:02:48 crc kubenswrapper[4831]: E1203 08:02:48.180996 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b\": container with ID starting with 
091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b not found: ID does not exist" containerID="091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.181018 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b"} err="failed to get container status \"091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b\": rpc error: code = NotFound desc = could not find container \"091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b\": container with ID starting with 091a687bdb89cea965370d56df3be61911f21b83c66bf57a4cde5d68b262299b not found: ID does not exist" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.206568 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-config\") pod \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.206645 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-sb\") pod \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.206714 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-nb\") pod \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.206766 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bh5k\" (UniqueName: 
\"kubernetes.io/projected/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-kube-api-access-2bh5k\") pod \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.206800 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-dns-svc\") pod \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\" (UID: \"be5b6fe2-0642-45a1-b2ae-943f4ac7f553\") " Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.211826 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-kube-api-access-2bh5k" (OuterVolumeSpecName: "kube-api-access-2bh5k") pod "be5b6fe2-0642-45a1-b2ae-943f4ac7f553" (UID: "be5b6fe2-0642-45a1-b2ae-943f4ac7f553"). InnerVolumeSpecName "kube-api-access-2bh5k". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.248171 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "be5b6fe2-0642-45a1-b2ae-943f4ac7f553" (UID: "be5b6fe2-0642-45a1-b2ae-943f4ac7f553"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.254872 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "be5b6fe2-0642-45a1-b2ae-943f4ac7f553" (UID: "be5b6fe2-0642-45a1-b2ae-943f4ac7f553"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.265882 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-config" (OuterVolumeSpecName: "config") pod "be5b6fe2-0642-45a1-b2ae-943f4ac7f553" (UID: "be5b6fe2-0642-45a1-b2ae-943f4ac7f553"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.271804 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "be5b6fe2-0642-45a1-b2ae-943f4ac7f553" (UID: "be5b6fe2-0642-45a1-b2ae-943f4ac7f553"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.308486 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.308522 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.308531 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.308541 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:48 crc kubenswrapper[4831]: 
I1203 08:02:48.308551 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bh5k\" (UniqueName: \"kubernetes.io/projected/be5b6fe2-0642-45a1-b2ae-943f4ac7f553-kube-api-access-2bh5k\") on node \"crc\" DevicePath \"\"" Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.439908 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-74b69dd9c7-k8bw7"] Dec 03 08:02:48 crc kubenswrapper[4831]: I1203 08:02:48.446859 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-74b69dd9c7-k8bw7"] Dec 03 08:02:49 crc kubenswrapper[4831]: I1203 08:02:49.048739 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" path="/var/lib/kubelet/pods/be5b6fe2-0642-45a1-b2ae-943f4ac7f553/volumes" Dec 03 08:02:57 crc kubenswrapper[4831]: I1203 08:02:57.596426 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:02:57 crc kubenswrapper[4831]: I1203 08:02:57.597063 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:02:57 crc kubenswrapper[4831]: I1203 08:02:57.597130 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:02:57 crc kubenswrapper[4831]: I1203 08:02:57.598135 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b048367eeeb7f28a9954ddb05cb80ad0ca3f94ca3e078cded79e040a0e5a5a5d"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:02:57 crc kubenswrapper[4831]: I1203 08:02:57.598233 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://b048367eeeb7f28a9954ddb05cb80ad0ca3f94ca3e078cded79e040a0e5a5a5d" gracePeriod=600 Dec 03 08:02:58 crc kubenswrapper[4831]: I1203 08:02:58.227250 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="b048367eeeb7f28a9954ddb05cb80ad0ca3f94ca3e078cded79e040a0e5a5a5d" exitCode=0 Dec 03 08:02:58 crc kubenswrapper[4831]: I1203 08:02:58.227453 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"b048367eeeb7f28a9954ddb05cb80ad0ca3f94ca3e078cded79e040a0e5a5a5d"} Dec 03 08:02:58 crc kubenswrapper[4831]: I1203 08:02:58.227809 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342"} Dec 03 08:02:58 crc kubenswrapper[4831]: I1203 08:02:58.227831 4831 scope.go:117] "RemoveContainer" containerID="1c0fbbc394e102fead34a710dc577e3f3d2fa0ce5a9822be3890db22f5cb9c28" Dec 03 08:03:07 crc kubenswrapper[4831]: I1203 08:03:07.659480 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67c669c55-sljbg" Dec 03 08:03:19 crc kubenswrapper[4831]: I1203 
08:03:19.984064 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-4tpvx"] Dec 03 08:03:19 crc kubenswrapper[4831]: E1203 08:03:19.985039 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" containerName="dnsmasq-dns" Dec 03 08:03:19 crc kubenswrapper[4831]: I1203 08:03:19.985056 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" containerName="dnsmasq-dns" Dec 03 08:03:19 crc kubenswrapper[4831]: E1203 08:03:19.985069 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" containerName="init" Dec 03 08:03:19 crc kubenswrapper[4831]: I1203 08:03:19.985076 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" containerName="init" Dec 03 08:03:19 crc kubenswrapper[4831]: I1203 08:03:19.985303 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="be5b6fe2-0642-45a1-b2ae-943f4ac7f553" containerName="dnsmasq-dns" Dec 03 08:03:19 crc kubenswrapper[4831]: I1203 08:03:19.986036 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:19 crc kubenswrapper[4831]: I1203 08:03:19.998631 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4tpvx"] Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.083629 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-45f9-account-create-update-5m2gl"] Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.085056 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.128050 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.132761 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-45f9-account-create-update-5m2gl"] Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.166560 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjgq\" (UniqueName: \"kubernetes.io/projected/2743a206-e77a-445a-bbbc-66987e3357a8-kube-api-access-gbjgq\") pod \"glance-db-create-4tpvx\" (UID: \"2743a206-e77a-445a-bbbc-66987e3357a8\") " pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.166648 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4825m\" (UniqueName: \"kubernetes.io/projected/e5d56d55-d26e-4957-9997-825ddfb815b9-kube-api-access-4825m\") pod \"glance-45f9-account-create-update-5m2gl\" (UID: \"e5d56d55-d26e-4957-9997-825ddfb815b9\") " pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.166745 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d56d55-d26e-4957-9997-825ddfb815b9-operator-scripts\") pod \"glance-45f9-account-create-update-5m2gl\" (UID: \"e5d56d55-d26e-4957-9997-825ddfb815b9\") " pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.166828 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2743a206-e77a-445a-bbbc-66987e3357a8-operator-scripts\") pod 
\"glance-db-create-4tpvx\" (UID: \"2743a206-e77a-445a-bbbc-66987e3357a8\") " pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.268604 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d56d55-d26e-4957-9997-825ddfb815b9-operator-scripts\") pod \"glance-45f9-account-create-update-5m2gl\" (UID: \"e5d56d55-d26e-4957-9997-825ddfb815b9\") " pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.269184 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2743a206-e77a-445a-bbbc-66987e3357a8-operator-scripts\") pod \"glance-db-create-4tpvx\" (UID: \"2743a206-e77a-445a-bbbc-66987e3357a8\") " pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.269334 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjgq\" (UniqueName: \"kubernetes.io/projected/2743a206-e77a-445a-bbbc-66987e3357a8-kube-api-access-gbjgq\") pod \"glance-db-create-4tpvx\" (UID: \"2743a206-e77a-445a-bbbc-66987e3357a8\") " pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.269427 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4825m\" (UniqueName: \"kubernetes.io/projected/e5d56d55-d26e-4957-9997-825ddfb815b9-kube-api-access-4825m\") pod \"glance-45f9-account-create-update-5m2gl\" (UID: \"e5d56d55-d26e-4957-9997-825ddfb815b9\") " pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.269503 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d56d55-d26e-4957-9997-825ddfb815b9-operator-scripts\") pod 
\"glance-45f9-account-create-update-5m2gl\" (UID: \"e5d56d55-d26e-4957-9997-825ddfb815b9\") " pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.270568 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2743a206-e77a-445a-bbbc-66987e3357a8-operator-scripts\") pod \"glance-db-create-4tpvx\" (UID: \"2743a206-e77a-445a-bbbc-66987e3357a8\") " pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.293530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjgq\" (UniqueName: \"kubernetes.io/projected/2743a206-e77a-445a-bbbc-66987e3357a8-kube-api-access-gbjgq\") pod \"glance-db-create-4tpvx\" (UID: \"2743a206-e77a-445a-bbbc-66987e3357a8\") " pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.299077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4825m\" (UniqueName: \"kubernetes.io/projected/e5d56d55-d26e-4957-9997-825ddfb815b9-kube-api-access-4825m\") pod \"glance-45f9-account-create-update-5m2gl\" (UID: \"e5d56d55-d26e-4957-9997-825ddfb815b9\") " pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.313305 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.437842 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.711001 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-45f9-account-create-update-5m2gl"] Dec 03 08:03:20 crc kubenswrapper[4831]: W1203 08:03:20.714019 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode5d56d55_d26e_4957_9997_825ddfb815b9.slice/crio-43b443814fd3dec45d2b180dd10bb4f8daeadf359f80bf01374a9bb494615904 WatchSource:0}: Error finding container 43b443814fd3dec45d2b180dd10bb4f8daeadf359f80bf01374a9bb494615904: Status 404 returned error can't find the container with id 43b443814fd3dec45d2b180dd10bb4f8daeadf359f80bf01374a9bb494615904 Dec 03 08:03:20 crc kubenswrapper[4831]: I1203 08:03:20.795949 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-4tpvx"] Dec 03 08:03:21 crc kubenswrapper[4831]: I1203 08:03:21.488581 4831 generic.go:334] "Generic (PLEG): container finished" podID="e5d56d55-d26e-4957-9997-825ddfb815b9" containerID="098d6a615ad31864eb4583e6aa8c0497c2cb4eb26eed7d8c44d604ad34cda9ba" exitCode=0 Dec 03 08:03:21 crc kubenswrapper[4831]: I1203 08:03:21.488675 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-45f9-account-create-update-5m2gl" event={"ID":"e5d56d55-d26e-4957-9997-825ddfb815b9","Type":"ContainerDied","Data":"098d6a615ad31864eb4583e6aa8c0497c2cb4eb26eed7d8c44d604ad34cda9ba"} Dec 03 08:03:21 crc kubenswrapper[4831]: I1203 08:03:21.489137 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-45f9-account-create-update-5m2gl" event={"ID":"e5d56d55-d26e-4957-9997-825ddfb815b9","Type":"ContainerStarted","Data":"43b443814fd3dec45d2b180dd10bb4f8daeadf359f80bf01374a9bb494615904"} Dec 03 08:03:21 crc kubenswrapper[4831]: I1203 08:03:21.492040 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="2743a206-e77a-445a-bbbc-66987e3357a8" containerID="f1b6f8adcaf9d7621b93036f3120bada0ab61cc07f41f8a61bcfc70ba9fbe979" exitCode=0 Dec 03 08:03:21 crc kubenswrapper[4831]: I1203 08:03:21.492107 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4tpvx" event={"ID":"2743a206-e77a-445a-bbbc-66987e3357a8","Type":"ContainerDied","Data":"f1b6f8adcaf9d7621b93036f3120bada0ab61cc07f41f8a61bcfc70ba9fbe979"} Dec 03 08:03:21 crc kubenswrapper[4831]: I1203 08:03:21.492150 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4tpvx" event={"ID":"2743a206-e77a-445a-bbbc-66987e3357a8","Type":"ContainerStarted","Data":"a304c79bb48e6ad4f17e8c117a143ba5d50cc1f2d3ef9810f2211bda2cd422af"} Dec 03 08:03:22 crc kubenswrapper[4831]: I1203 08:03:22.996154 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.003443 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.114601 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2743a206-e77a-445a-bbbc-66987e3357a8-operator-scripts\") pod \"2743a206-e77a-445a-bbbc-66987e3357a8\" (UID: \"2743a206-e77a-445a-bbbc-66987e3357a8\") " Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.114714 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d56d55-d26e-4957-9997-825ddfb815b9-operator-scripts\") pod \"e5d56d55-d26e-4957-9997-825ddfb815b9\" (UID: \"e5d56d55-d26e-4957-9997-825ddfb815b9\") " Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.114776 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjgq\" (UniqueName: \"kubernetes.io/projected/2743a206-e77a-445a-bbbc-66987e3357a8-kube-api-access-gbjgq\") pod \"2743a206-e77a-445a-bbbc-66987e3357a8\" (UID: \"2743a206-e77a-445a-bbbc-66987e3357a8\") " Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.114886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4825m\" (UniqueName: \"kubernetes.io/projected/e5d56d55-d26e-4957-9997-825ddfb815b9-kube-api-access-4825m\") pod \"e5d56d55-d26e-4957-9997-825ddfb815b9\" (UID: \"e5d56d55-d26e-4957-9997-825ddfb815b9\") " Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.115913 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2743a206-e77a-445a-bbbc-66987e3357a8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2743a206-e77a-445a-bbbc-66987e3357a8" (UID: "2743a206-e77a-445a-bbbc-66987e3357a8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.115947 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e5d56d55-d26e-4957-9997-825ddfb815b9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e5d56d55-d26e-4957-9997-825ddfb815b9" (UID: "e5d56d55-d26e-4957-9997-825ddfb815b9"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.120456 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5d56d55-d26e-4957-9997-825ddfb815b9-kube-api-access-4825m" (OuterVolumeSpecName: "kube-api-access-4825m") pod "e5d56d55-d26e-4957-9997-825ddfb815b9" (UID: "e5d56d55-d26e-4957-9997-825ddfb815b9"). InnerVolumeSpecName "kube-api-access-4825m". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.120522 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2743a206-e77a-445a-bbbc-66987e3357a8-kube-api-access-gbjgq" (OuterVolumeSpecName: "kube-api-access-gbjgq") pod "2743a206-e77a-445a-bbbc-66987e3357a8" (UID: "2743a206-e77a-445a-bbbc-66987e3357a8"). InnerVolumeSpecName "kube-api-access-gbjgq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.216857 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2743a206-e77a-445a-bbbc-66987e3357a8-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.216913 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e5d56d55-d26e-4957-9997-825ddfb815b9-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.216933 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gbjgq\" (UniqueName: \"kubernetes.io/projected/2743a206-e77a-445a-bbbc-66987e3357a8-kube-api-access-gbjgq\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.216953 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4825m\" (UniqueName: \"kubernetes.io/projected/e5d56d55-d26e-4957-9997-825ddfb815b9-kube-api-access-4825m\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.519139 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-45f9-account-create-update-5m2gl" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.519134 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-45f9-account-create-update-5m2gl" event={"ID":"e5d56d55-d26e-4957-9997-825ddfb815b9","Type":"ContainerDied","Data":"43b443814fd3dec45d2b180dd10bb4f8daeadf359f80bf01374a9bb494615904"} Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.519731 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="43b443814fd3dec45d2b180dd10bb4f8daeadf359f80bf01374a9bb494615904" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.521009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-4tpvx" event={"ID":"2743a206-e77a-445a-bbbc-66987e3357a8","Type":"ContainerDied","Data":"a304c79bb48e6ad4f17e8c117a143ba5d50cc1f2d3ef9810f2211bda2cd422af"} Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.521057 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a304c79bb48e6ad4f17e8c117a143ba5d50cc1f2d3ef9810f2211bda2cd422af" Dec 03 08:03:23 crc kubenswrapper[4831]: I1203 08:03:23.521039 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-4tpvx" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.291879 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-lzg9j"] Dec 03 08:03:25 crc kubenswrapper[4831]: E1203 08:03:25.292947 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2743a206-e77a-445a-bbbc-66987e3357a8" containerName="mariadb-database-create" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.292972 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2743a206-e77a-445a-bbbc-66987e3357a8" containerName="mariadb-database-create" Dec 03 08:03:25 crc kubenswrapper[4831]: E1203 08:03:25.293047 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5d56d55-d26e-4957-9997-825ddfb815b9" containerName="mariadb-account-create-update" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.293063 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5d56d55-d26e-4957-9997-825ddfb815b9" containerName="mariadb-account-create-update" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.293383 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5d56d55-d26e-4957-9997-825ddfb815b9" containerName="mariadb-account-create-update" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.293438 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2743a206-e77a-445a-bbbc-66987e3357a8" containerName="mariadb-database-create" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.294483 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.299013 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.299207 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vqglp" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.305107 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lzg9j"] Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.352789 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvwdf\" (UniqueName: \"kubernetes.io/projected/4a5d905f-87c5-4c09-8825-4938cda62ee7-kube-api-access-kvwdf\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.352882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-combined-ca-bundle\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.352908 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-db-sync-config-data\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.352933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-config-data\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.454139 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-combined-ca-bundle\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.454216 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-db-sync-config-data\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.454272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-config-data\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.454399 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvwdf\" (UniqueName: \"kubernetes.io/projected/4a5d905f-87c5-4c09-8825-4938cda62ee7-kube-api-access-kvwdf\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.460070 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-db-sync-config-data\") pod \"glance-db-sync-lzg9j\" (UID: 
\"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.465544 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-config-data\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.473224 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-combined-ca-bundle\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.474432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvwdf\" (UniqueName: \"kubernetes.io/projected/4a5d905f-87c5-4c09-8825-4938cda62ee7-kube-api-access-kvwdf\") pod \"glance-db-sync-lzg9j\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.627632 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:25 crc kubenswrapper[4831]: I1203 08:03:25.991720 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-lzg9j"] Dec 03 08:03:26 crc kubenswrapper[4831]: I1203 08:03:26.549692 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lzg9j" event={"ID":"4a5d905f-87c5-4c09-8825-4938cda62ee7","Type":"ContainerStarted","Data":"11c105232d78505ef4f74999eb163125243e8bb7f52638ae83bb1aeaff58b517"} Dec 03 08:03:27 crc kubenswrapper[4831]: I1203 08:03:27.591019 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lzg9j" event={"ID":"4a5d905f-87c5-4c09-8825-4938cda62ee7","Type":"ContainerStarted","Data":"dbeed3b0cf0f639fe8f509483e0df43eceb9bb8b54d9db23191dc5256fd851e8"} Dec 03 08:03:27 crc kubenswrapper[4831]: I1203 08:03:27.639811 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-lzg9j" podStartSLOduration=2.639790434 podStartE2EDuration="2.639790434s" podCreationTimestamp="2025-12-03 08:03:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:03:27.617477679 +0000 UTC m=+5544.961061197" watchObservedRunningTime="2025-12-03 08:03:27.639790434 +0000 UTC m=+5544.983373962" Dec 03 08:03:29 crc kubenswrapper[4831]: I1203 08:03:29.613831 4831 generic.go:334] "Generic (PLEG): container finished" podID="4a5d905f-87c5-4c09-8825-4938cda62ee7" containerID="dbeed3b0cf0f639fe8f509483e0df43eceb9bb8b54d9db23191dc5256fd851e8" exitCode=0 Dec 03 08:03:29 crc kubenswrapper[4831]: I1203 08:03:29.613961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lzg9j" event={"ID":"4a5d905f-87c5-4c09-8825-4938cda62ee7","Type":"ContainerDied","Data":"dbeed3b0cf0f639fe8f509483e0df43eceb9bb8b54d9db23191dc5256fd851e8"} Dec 03 08:03:31 crc kubenswrapper[4831]: 
I1203 08:03:31.100029 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.150423 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kvwdf\" (UniqueName: \"kubernetes.io/projected/4a5d905f-87c5-4c09-8825-4938cda62ee7-kube-api-access-kvwdf\") pod \"4a5d905f-87c5-4c09-8825-4938cda62ee7\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.150532 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-combined-ca-bundle\") pod \"4a5d905f-87c5-4c09-8825-4938cda62ee7\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.150718 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-config-data\") pod \"4a5d905f-87c5-4c09-8825-4938cda62ee7\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.150947 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-db-sync-config-data\") pod \"4a5d905f-87c5-4c09-8825-4938cda62ee7\" (UID: \"4a5d905f-87c5-4c09-8825-4938cda62ee7\") " Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.157475 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a5d905f-87c5-4c09-8825-4938cda62ee7-kube-api-access-kvwdf" (OuterVolumeSpecName: "kube-api-access-kvwdf") pod "4a5d905f-87c5-4c09-8825-4938cda62ee7" (UID: "4a5d905f-87c5-4c09-8825-4938cda62ee7"). InnerVolumeSpecName "kube-api-access-kvwdf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.157870 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "4a5d905f-87c5-4c09-8825-4938cda62ee7" (UID: "4a5d905f-87c5-4c09-8825-4938cda62ee7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.190221 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4a5d905f-87c5-4c09-8825-4938cda62ee7" (UID: "4a5d905f-87c5-4c09-8825-4938cda62ee7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.202987 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-config-data" (OuterVolumeSpecName: "config-data") pod "4a5d905f-87c5-4c09-8825-4938cda62ee7" (UID: "4a5d905f-87c5-4c09-8825-4938cda62ee7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.253554 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.253603 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.253627 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kvwdf\" (UniqueName: \"kubernetes.io/projected/4a5d905f-87c5-4c09-8825-4938cda62ee7-kube-api-access-kvwdf\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.253644 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a5d905f-87c5-4c09-8825-4938cda62ee7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.636419 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-lzg9j" event={"ID":"4a5d905f-87c5-4c09-8825-4938cda62ee7","Type":"ContainerDied","Data":"11c105232d78505ef4f74999eb163125243e8bb7f52638ae83bb1aeaff58b517"} Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.636456 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11c105232d78505ef4f74999eb163125243e8bb7f52638ae83bb1aeaff58b517" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.636507 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-lzg9j" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.962530 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:31 crc kubenswrapper[4831]: E1203 08:03:31.963348 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a5d905f-87c5-4c09-8825-4938cda62ee7" containerName="glance-db-sync" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.963458 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a5d905f-87c5-4c09-8825-4938cda62ee7" containerName="glance-db-sync" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.963755 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a5d905f-87c5-4c09-8825-4938cda62ee7" containerName="glance-db-sync" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.965090 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.967417 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.968165 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.968699 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-vqglp" Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.982021 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:31 crc kubenswrapper[4831]: I1203 08:03:31.983917 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.066482 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-dd7cfb74c-5xxmr"] Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.068474 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.089772 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dd7cfb74c-5xxmr"] Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.118429 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-logs\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.118562 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-scripts\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.118618 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.118661 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 
08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.118689 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.118725 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sq6ll\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-kube-api-access-sq6ll\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.118779 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-ceph\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.213984 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.215392 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.217524 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.220300 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-config\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.220388 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5v9x\" (UniqueName: \"kubernetes.io/projected/b9d47adc-4f70-49c6-9211-8718d57aeafc-kube-api-access-r5v9x\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.220443 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-logs\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.220703 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-scripts\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.220882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-sb\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.220978 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.220894 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-logs\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.221165 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.221290 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.221442 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sq6ll\" (UniqueName: 
\"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-kube-api-access-sq6ll\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.221530 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-dns-svc\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.221613 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-nb\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.221365 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.221785 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-ceph\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.230955 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-scripts\") pod 
\"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.230966 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-ceph\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.234388 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.235499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.247764 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.249056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sq6ll\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-kube-api-access-sq6ll\") pod \"glance-default-external-api-0\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.283750 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.323985 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-ceph\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324077 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324163 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-sb\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324242 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324300 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324377 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-dns-svc\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324407 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-nb\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324462 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324491 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mb7pn\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-kube-api-access-mb7pn\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324591 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324637 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-config\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.324684 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5v9x\" (UniqueName: \"kubernetes.io/projected/b9d47adc-4f70-49c6-9211-8718d57aeafc-kube-api-access-r5v9x\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.330988 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-nb\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.331878 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-config\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.332093 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-sb\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 
crc kubenswrapper[4831]: I1203 08:03:32.335303 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-dns-svc\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.349709 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5v9x\" (UniqueName: \"kubernetes.io/projected/b9d47adc-4f70-49c6-9211-8718d57aeafc-kube-api-access-r5v9x\") pod \"dnsmasq-dns-dd7cfb74c-5xxmr\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.389015 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.427390 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.430146 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-ceph\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.430186 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.430275 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.430484 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-logs\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.430542 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.430571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mb7pn\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-kube-api-access-mb7pn\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.432043 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-logs\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.432760 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.433779 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-scripts\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.435197 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-ceph\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.439419 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-config-data\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.445152 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.451024 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mb7pn\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-kube-api-access-mb7pn\") pod \"glance-default-internal-api-0\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.609787 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.892757 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-dd7cfb74c-5xxmr"] Dec 03 08:03:32 crc kubenswrapper[4831]: I1203 08:03:32.906993 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:32 crc kubenswrapper[4831]: W1203 08:03:32.911505 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cd35d32_6f0c_4249_a0a7_153254745624.slice/crio-564885090e894d8eea30ae012fbf70535d24c232f27325086aaba01e3edcb86a WatchSource:0}: Error finding container 564885090e894d8eea30ae012fbf70535d24c232f27325086aaba01e3edcb86a: Status 404 returned error can't find the container with id 564885090e894d8eea30ae012fbf70535d24c232f27325086aaba01e3edcb86a Dec 03 08:03:33 crc kubenswrapper[4831]: I1203 08:03:33.113433 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:33 crc kubenswrapper[4831]: I1203 08:03:33.157047 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:33 crc kubenswrapper[4831]: W1203 08:03:33.189048 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2eacb379_de6c_48c8_86bd_db9f51bdc5ec.slice/crio-3274819083d0c85f72f575c3795867838bad0505706e890f61d73fe6627ff827 WatchSource:0}: Error finding container 3274819083d0c85f72f575c3795867838bad0505706e890f61d73fe6627ff827: Status 404 returned error can't find the container with id 3274819083d0c85f72f575c3795867838bad0505706e890f61d73fe6627ff827 Dec 03 08:03:33 crc kubenswrapper[4831]: I1203 08:03:33.662452 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eacb379-de6c-48c8-86bd-db9f51bdc5ec","Type":"ContainerStarted","Data":"3274819083d0c85f72f575c3795867838bad0505706e890f61d73fe6627ff827"} Dec 03 08:03:33 crc kubenswrapper[4831]: I1203 08:03:33.664381 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cd35d32-6f0c-4249-a0a7-153254745624","Type":"ContainerStarted","Data":"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357"} Dec 03 08:03:33 crc kubenswrapper[4831]: I1203 08:03:33.664439 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cd35d32-6f0c-4249-a0a7-153254745624","Type":"ContainerStarted","Data":"564885090e894d8eea30ae012fbf70535d24c232f27325086aaba01e3edcb86a"} Dec 03 08:03:33 crc kubenswrapper[4831]: I1203 08:03:33.666683 4831 generic.go:334] "Generic (PLEG): container finished" podID="b9d47adc-4f70-49c6-9211-8718d57aeafc" containerID="22ce868330d9c75e7f31bccc280fd896cc9be5642fc95ffc555ba44f020139bc" exitCode=0 Dec 03 08:03:33 crc kubenswrapper[4831]: I1203 08:03:33.666730 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" event={"ID":"b9d47adc-4f70-49c6-9211-8718d57aeafc","Type":"ContainerDied","Data":"22ce868330d9c75e7f31bccc280fd896cc9be5642fc95ffc555ba44f020139bc"} Dec 03 08:03:33 crc kubenswrapper[4831]: I1203 08:03:33.666752 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" event={"ID":"b9d47adc-4f70-49c6-9211-8718d57aeafc","Type":"ContainerStarted","Data":"67f7f1119218f29ddb809726c0bb11bb8199faee4d7028f8bdfd989617c7aa7e"} Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.696760 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eacb379-de6c-48c8-86bd-db9f51bdc5ec","Type":"ContainerStarted","Data":"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34"} Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.697073 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eacb379-de6c-48c8-86bd-db9f51bdc5ec","Type":"ContainerStarted","Data":"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222"} Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.702188 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cd35d32-6f0c-4249-a0a7-153254745624","Type":"ContainerStarted","Data":"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1"} Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.702711 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" containerName="glance-log" containerID="cri-o://dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357" gracePeriod=30 Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.702917 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" containerName="glance-httpd" containerID="cri-o://b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1" gracePeriod=30 Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.710753 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" event={"ID":"b9d47adc-4f70-49c6-9211-8718d57aeafc","Type":"ContainerStarted","Data":"fbdfcf6d5f00426b8e5479f94a7ae967f1a591c51409021da72a815607ed2fa6"} Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.710909 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.723042 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.723027234 podStartE2EDuration="2.723027234s" podCreationTimestamp="2025-12-03 08:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:03:34.718427481 +0000 UTC m=+5552.062011009" watchObservedRunningTime="2025-12-03 08:03:34.723027234 +0000 UTC m=+5552.066610742" Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.738362 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.738348491 podStartE2EDuration="3.738348491s" podCreationTimestamp="2025-12-03 08:03:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:03:34.735651227 +0000 UTC m=+5552.079234735" watchObservedRunningTime="2025-12-03 08:03:34.738348491 +0000 UTC m=+5552.081931999" Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.755876 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" podStartSLOduration=2.755865285 podStartE2EDuration="2.755865285s" podCreationTimestamp="2025-12-03 08:03:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
08:03:34.753160651 +0000 UTC m=+5552.096744149" watchObservedRunningTime="2025-12-03 08:03:34.755865285 +0000 UTC m=+5552.099448793" Dec 03 08:03:34 crc kubenswrapper[4831]: I1203 08:03:34.980396 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.423341 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.484824 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-logs\") pod \"2cd35d32-6f0c-4249-a0a7-153254745624\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.485066 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-httpd-run\") pod \"2cd35d32-6f0c-4249-a0a7-153254745624\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.485337 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sq6ll\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-kube-api-access-sq6ll\") pod \"2cd35d32-6f0c-4249-a0a7-153254745624\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.485460 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2cd35d32-6f0c-4249-a0a7-153254745624" (UID: "2cd35d32-6f0c-4249-a0a7-153254745624"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.485494 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-scripts\") pod \"2cd35d32-6f0c-4249-a0a7-153254745624\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.485582 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-logs" (OuterVolumeSpecName: "logs") pod "2cd35d32-6f0c-4249-a0a7-153254745624" (UID: "2cd35d32-6f0c-4249-a0a7-153254745624"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.485617 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-ceph\") pod \"2cd35d32-6f0c-4249-a0a7-153254745624\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.485788 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-config-data\") pod \"2cd35d32-6f0c-4249-a0a7-153254745624\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.485953 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-combined-ca-bundle\") pod \"2cd35d32-6f0c-4249-a0a7-153254745624\" (UID: \"2cd35d32-6f0c-4249-a0a7-153254745624\") " Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.487084 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.487312 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cd35d32-6f0c-4249-a0a7-153254745624-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.492017 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-scripts" (OuterVolumeSpecName: "scripts") pod "2cd35d32-6f0c-4249-a0a7-153254745624" (UID: "2cd35d32-6f0c-4249-a0a7-153254745624"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.492019 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-ceph" (OuterVolumeSpecName: "ceph") pod "2cd35d32-6f0c-4249-a0a7-153254745624" (UID: "2cd35d32-6f0c-4249-a0a7-153254745624"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.492679 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-kube-api-access-sq6ll" (OuterVolumeSpecName: "kube-api-access-sq6ll") pod "2cd35d32-6f0c-4249-a0a7-153254745624" (UID: "2cd35d32-6f0c-4249-a0a7-153254745624"). InnerVolumeSpecName "kube-api-access-sq6ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.532805 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cd35d32-6f0c-4249-a0a7-153254745624" (UID: "2cd35d32-6f0c-4249-a0a7-153254745624"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.555486 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-config-data" (OuterVolumeSpecName: "config-data") pod "2cd35d32-6f0c-4249-a0a7-153254745624" (UID: "2cd35d32-6f0c-4249-a0a7-153254745624"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.589833 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sq6ll\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-kube-api-access-sq6ll\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.589891 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.589912 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cd35d32-6f0c-4249-a0a7-153254745624-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.589931 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:35 crc 
kubenswrapper[4831]: I1203 08:03:35.589954 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cd35d32-6f0c-4249-a0a7-153254745624-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.720861 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cd35d32-6f0c-4249-a0a7-153254745624" containerID="b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1" exitCode=0 Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.720898 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cd35d32-6f0c-4249-a0a7-153254745624" containerID="dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357" exitCode=143 Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.720943 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.720941 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cd35d32-6f0c-4249-a0a7-153254745624","Type":"ContainerDied","Data":"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1"} Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.721249 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cd35d32-6f0c-4249-a0a7-153254745624","Type":"ContainerDied","Data":"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357"} Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.721330 4831 scope.go:117] "RemoveContainer" containerID="b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.721346 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"2cd35d32-6f0c-4249-a0a7-153254745624","Type":"ContainerDied","Data":"564885090e894d8eea30ae012fbf70535d24c232f27325086aaba01e3edcb86a"} Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.767698 4831 scope.go:117] "RemoveContainer" containerID="dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.767798 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.774960 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.794020 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:35 crc kubenswrapper[4831]: E1203 08:03:35.794439 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" containerName="glance-httpd" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.794457 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" containerName="glance-httpd" Dec 03 08:03:35 crc kubenswrapper[4831]: E1203 08:03:35.794471 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" containerName="glance-log" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.794478 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" containerName="glance-log" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.794649 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" containerName="glance-log" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.794668 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" containerName="glance-httpd" Dec 
03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.796180 4831 scope.go:117] "RemoveContainer" containerID="b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1" Dec 03 08:03:35 crc kubenswrapper[4831]: E1203 08:03:35.797190 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1\": container with ID starting with b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1 not found: ID does not exist" containerID="b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.797281 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1"} err="failed to get container status \"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1\": rpc error: code = NotFound desc = could not find container \"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1\": container with ID starting with b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1 not found: ID does not exist" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.797338 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.797359 4831 scope.go:117] "RemoveContainer" containerID="dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357" Dec 03 08:03:35 crc kubenswrapper[4831]: E1203 08:03:35.797697 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357\": container with ID starting with dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357 not found: ID does not exist" containerID="dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.797733 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357"} err="failed to get container status \"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357\": rpc error: code = NotFound desc = could not find container \"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357\": container with ID starting with dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357 not found: ID does not exist" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.797747 4831 scope.go:117] "RemoveContainer" containerID="b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.802720 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1"} err="failed to get container status \"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1\": rpc error: code = NotFound desc = could not find container \"b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1\": container with ID starting with 
b588a67a3955596906ae4be550eaedbf38aee48e7449dfa37ecfb3683b3a1ba1 not found: ID does not exist" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.802803 4831 scope.go:117] "RemoveContainer" containerID="dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.803297 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357"} err="failed to get container status \"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357\": rpc error: code = NotFound desc = could not find container \"dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357\": container with ID starting with dde79f4bf5ed38249c7b1495ea1e72e8300c364abdcb2bf290b86851a5bc9357 not found: ID does not exist" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.804503 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.812084 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.895997 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.896052 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " 
pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.896117 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-logs\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.896135 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46k8j\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-kube-api-access-46k8j\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.896184 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.896287 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-ceph\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.896391 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: 
\"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.997938 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-ceph\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.997990 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.998063 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.998083 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.998124 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-logs\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc 
kubenswrapper[4831]: I1203 08:03:35.998148 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46k8j\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-kube-api-access-46k8j\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.998187 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.998951 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:35 crc kubenswrapper[4831]: I1203 08:03:35.998969 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-logs\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.002180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-ceph\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.002517 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-scripts\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.003248 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.003478 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-config-data\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.016821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46k8j\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-kube-api-access-46k8j\") pod \"glance-default-external-api-0\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " pod="openstack/glance-default-external-api-0" Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.132244 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.732896 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerName="glance-log" containerID="cri-o://4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222" gracePeriod=30 Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.732980 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerName="glance-httpd" containerID="cri-o://f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34" gracePeriod=30 Dec 03 08:03:36 crc kubenswrapper[4831]: I1203 08:03:36.780608 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:03:36 crc kubenswrapper[4831]: W1203 08:03:36.790360 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cfd386b_0830_4781_ba21_109800ca3b3a.slice/crio-782ddc0118cfb6e3f3715dca6f541629bf48f9e38ecf204d54d06643b7c9498d WatchSource:0}: Error finding container 782ddc0118cfb6e3f3715dca6f541629bf48f9e38ecf204d54d06643b7c9498d: Status 404 returned error can't find the container with id 782ddc0118cfb6e3f3715dca6f541629bf48f9e38ecf204d54d06643b7c9498d Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.023109 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cd35d32-6f0c-4249-a0a7-153254745624" path="/var/lib/kubelet/pods/2cd35d32-6f0c-4249-a0a7-153254745624/volumes" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.241705 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.336594 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-combined-ca-bundle\") pod \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.336764 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mb7pn\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-kube-api-access-mb7pn\") pod \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.336855 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-ceph\") pod \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.336918 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-httpd-run\") pod \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.337001 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-scripts\") pod \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.337025 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-config-data\") pod \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.337047 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-logs\") pod \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\" (UID: \"2eacb379-de6c-48c8-86bd-db9f51bdc5ec\") " Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.338053 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2eacb379-de6c-48c8-86bd-db9f51bdc5ec" (UID: "2eacb379-de6c-48c8-86bd-db9f51bdc5ec"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.338070 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-logs" (OuterVolumeSpecName: "logs") pod "2eacb379-de6c-48c8-86bd-db9f51bdc5ec" (UID: "2eacb379-de6c-48c8-86bd-db9f51bdc5ec"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.344931 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-ceph" (OuterVolumeSpecName: "ceph") pod "2eacb379-de6c-48c8-86bd-db9f51bdc5ec" (UID: "2eacb379-de6c-48c8-86bd-db9f51bdc5ec"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.349550 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-kube-api-access-mb7pn" (OuterVolumeSpecName: "kube-api-access-mb7pn") pod "2eacb379-de6c-48c8-86bd-db9f51bdc5ec" (UID: "2eacb379-de6c-48c8-86bd-db9f51bdc5ec"). InnerVolumeSpecName "kube-api-access-mb7pn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.355293 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-scripts" (OuterVolumeSpecName: "scripts") pod "2eacb379-de6c-48c8-86bd-db9f51bdc5ec" (UID: "2eacb379-de6c-48c8-86bd-db9f51bdc5ec"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.403424 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2eacb379-de6c-48c8-86bd-db9f51bdc5ec" (UID: "2eacb379-de6c-48c8-86bd-db9f51bdc5ec"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.417226 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-config-data" (OuterVolumeSpecName: "config-data") pod "2eacb379-de6c-48c8-86bd-db9f51bdc5ec" (UID: "2eacb379-de6c-48c8-86bd-db9f51bdc5ec"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.439050 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mb7pn\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-kube-api-access-mb7pn\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.439087 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.439097 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.439107 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.439116 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.439125 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.439132 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2eacb379-de6c-48c8-86bd-db9f51bdc5ec-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.759815 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerID="f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34" exitCode=0 Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.760092 4831 generic.go:334] "Generic (PLEG): container finished" podID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerID="4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222" exitCode=143 Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.759947 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.760009 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eacb379-de6c-48c8-86bd-db9f51bdc5ec","Type":"ContainerDied","Data":"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34"} Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.765573 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eacb379-de6c-48c8-86bd-db9f51bdc5ec","Type":"ContainerDied","Data":"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222"} Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.765603 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"2eacb379-de6c-48c8-86bd-db9f51bdc5ec","Type":"ContainerDied","Data":"3274819083d0c85f72f575c3795867838bad0505706e890f61d73fe6627ff827"} Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.765626 4831 scope.go:117] "RemoveContainer" containerID="f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.769654 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"2cfd386b-0830-4781-ba21-109800ca3b3a","Type":"ContainerStarted","Data":"bf6e5a37fe4aade2d59fbb09879b583c1ef42ed33d5da75d3f75bc16643922b7"} Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.769698 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cfd386b-0830-4781-ba21-109800ca3b3a","Type":"ContainerStarted","Data":"782ddc0118cfb6e3f3715dca6f541629bf48f9e38ecf204d54d06643b7c9498d"} Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.833967 4831 scope.go:117] "RemoveContainer" containerID="4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.840935 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.849494 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.871114 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:37 crc kubenswrapper[4831]: E1203 08:03:37.871617 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerName="glance-httpd" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.871641 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerName="glance-httpd" Dec 03 08:03:37 crc kubenswrapper[4831]: E1203 08:03:37.871677 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerName="glance-log" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.871686 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerName="glance-log" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.871910 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerName="glance-log" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.871937 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" containerName="glance-httpd" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.873113 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.877629 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.878733 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.890560 4831 scope.go:117] "RemoveContainer" containerID="f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34" Dec 03 08:03:37 crc kubenswrapper[4831]: E1203 08:03:37.892479 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34\": container with ID starting with f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34 not found: ID does not exist" containerID="f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.892520 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34"} err="failed to get container status \"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34\": rpc error: code = NotFound desc = could not find container \"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34\": container with ID starting 
with f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34 not found: ID does not exist" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.892547 4831 scope.go:117] "RemoveContainer" containerID="4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222" Dec 03 08:03:37 crc kubenswrapper[4831]: E1203 08:03:37.898919 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222\": container with ID starting with 4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222 not found: ID does not exist" containerID="4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.899100 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222"} err="failed to get container status \"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222\": rpc error: code = NotFound desc = could not find container \"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222\": container with ID starting with 4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222 not found: ID does not exist" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.899239 4831 scope.go:117] "RemoveContainer" containerID="f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.899753 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34"} err="failed to get container status \"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34\": rpc error: code = NotFound desc = could not find container \"f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34\": container with ID 
starting with f02ccf0fbc2289c5626f4744b33bc60c8cfc810f67b79f6c87a9c0f31e86cf34 not found: ID does not exist" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.899903 4831 scope.go:117] "RemoveContainer" containerID="4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.900762 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222"} err="failed to get container status \"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222\": rpc error: code = NotFound desc = could not find container \"4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222\": container with ID starting with 4be01853cafcda02333799a1dbdfe64af0128b59df9908d9cf59804d1447b222 not found: ID does not exist" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.952237 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-logs\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.952310 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.952409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-httpd-run\") pod \"glance-default-internal-api-0\" (UID: 
\"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.952441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.952474 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.952511 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72rw2\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-kube-api-access-72rw2\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:37 crc kubenswrapper[4831]: I1203 08:03:37.952584 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.053838 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-logs\") pod \"glance-default-internal-api-0\" 
(UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.053898 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.053949 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.053975 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.054000 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.054026 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72rw2\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-kube-api-access-72rw2\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " 
pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.054084 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.054480 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-logs\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.054504 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.057489 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.057743 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-ceph\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.059978 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-config-data\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.062925 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-scripts\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.082493 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-72rw2\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-kube-api-access-72rw2\") pod \"glance-default-internal-api-0\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.191425 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.553901 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.779929 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cfd386b-0830-4781-ba21-109800ca3b3a","Type":"ContainerStarted","Data":"6690fb5d84a0049fedb1a76e47388552d7104c6d764e7ba8724feb99cf5a3c36"} Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.781112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567","Type":"ContainerStarted","Data":"52e1de03c0bf5638b37329693fbb73f5c59d41d3285b11f533718bf1cf38d3dc"} Dec 03 08:03:38 crc kubenswrapper[4831]: I1203 08:03:38.809619 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.809599408 podStartE2EDuration="3.809599408s" podCreationTimestamp="2025-12-03 08:03:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:03:38.798802762 +0000 UTC m=+5556.142386280" watchObservedRunningTime="2025-12-03 08:03:38.809599408 +0000 UTC m=+5556.153182916" Dec 03 08:03:39 crc kubenswrapper[4831]: I1203 08:03:39.032252 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2eacb379-de6c-48c8-86bd-db9f51bdc5ec" path="/var/lib/kubelet/pods/2eacb379-de6c-48c8-86bd-db9f51bdc5ec/volumes" Dec 03 08:03:39 crc kubenswrapper[4831]: I1203 08:03:39.800364 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567","Type":"ContainerStarted","Data":"5c399a1e6ba809edc2fde19b39403671975d551377ce2d2380d31f78998d1f97"} Dec 03 08:03:39 crc kubenswrapper[4831]: I1203 08:03:39.800658 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567","Type":"ContainerStarted","Data":"98b142fd3b5b021abe0d2d78e7c248a25da7614f73931058ce4310073980f966"} Dec 03 08:03:39 crc kubenswrapper[4831]: I1203 08:03:39.841902 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=2.8418530520000003 podStartE2EDuration="2.841853052s" podCreationTimestamp="2025-12-03 08:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:03:39.831480068 +0000 UTC m=+5557.175063586" watchObservedRunningTime="2025-12-03 08:03:39.841853052 +0000 UTC m=+5557.185436600" Dec 03 08:03:42 crc kubenswrapper[4831]: I1203 08:03:42.589476 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:03:42 crc kubenswrapper[4831]: I1203 08:03:42.659816 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd99b764c-m9sbd"] Dec 03 08:03:42 crc kubenswrapper[4831]: I1203 08:03:42.660146 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" podUID="d951b914-3516-4abd-83af-6b7fcf91d390" containerName="dnsmasq-dns" containerID="cri-o://fa2a63d762e981110f8efa36740d4f2fe113ec8a3cb221e3825d026d8033a04e" gracePeriod=10 Dec 03 08:03:42 crc kubenswrapper[4831]: I1203 08:03:42.839272 4831 generic.go:334] "Generic (PLEG): container finished" podID="d951b914-3516-4abd-83af-6b7fcf91d390" containerID="fa2a63d762e981110f8efa36740d4f2fe113ec8a3cb221e3825d026d8033a04e" 
exitCode=0 Dec 03 08:03:42 crc kubenswrapper[4831]: I1203 08:03:42.839384 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" event={"ID":"d951b914-3516-4abd-83af-6b7fcf91d390","Type":"ContainerDied","Data":"fa2a63d762e981110f8efa36740d4f2fe113ec8a3cb221e3825d026d8033a04e"} Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.213865 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.306861 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n74c4\" (UniqueName: \"kubernetes.io/projected/d951b914-3516-4abd-83af-6b7fcf91d390-kube-api-access-n74c4\") pod \"d951b914-3516-4abd-83af-6b7fcf91d390\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.307121 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-dns-svc\") pod \"d951b914-3516-4abd-83af-6b7fcf91d390\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.307344 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-nb\") pod \"d951b914-3516-4abd-83af-6b7fcf91d390\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.307397 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-sb\") pod \"d951b914-3516-4abd-83af-6b7fcf91d390\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.307444 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-config\") pod \"d951b914-3516-4abd-83af-6b7fcf91d390\" (UID: \"d951b914-3516-4abd-83af-6b7fcf91d390\") " Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.314246 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d951b914-3516-4abd-83af-6b7fcf91d390-kube-api-access-n74c4" (OuterVolumeSpecName: "kube-api-access-n74c4") pod "d951b914-3516-4abd-83af-6b7fcf91d390" (UID: "d951b914-3516-4abd-83af-6b7fcf91d390"). InnerVolumeSpecName "kube-api-access-n74c4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.363522 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-config" (OuterVolumeSpecName: "config") pod "d951b914-3516-4abd-83af-6b7fcf91d390" (UID: "d951b914-3516-4abd-83af-6b7fcf91d390"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.365009 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d951b914-3516-4abd-83af-6b7fcf91d390" (UID: "d951b914-3516-4abd-83af-6b7fcf91d390"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.365722 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d951b914-3516-4abd-83af-6b7fcf91d390" (UID: "d951b914-3516-4abd-83af-6b7fcf91d390"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.381898 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d951b914-3516-4abd-83af-6b7fcf91d390" (UID: "d951b914-3516-4abd-83af-6b7fcf91d390"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.411226 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.411273 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.411289 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.411303 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n74c4\" (UniqueName: \"kubernetes.io/projected/d951b914-3516-4abd-83af-6b7fcf91d390-kube-api-access-n74c4\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.411333 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d951b914-3516-4abd-83af-6b7fcf91d390-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.850961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" 
event={"ID":"d951b914-3516-4abd-83af-6b7fcf91d390","Type":"ContainerDied","Data":"5c1b728879e01edcc6a0fc48d90292bcf1669177a6fe4bba7e64fdf8500736ce"} Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.851016 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-cd99b764c-m9sbd" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.851031 4831 scope.go:117] "RemoveContainer" containerID="fa2a63d762e981110f8efa36740d4f2fe113ec8a3cb221e3825d026d8033a04e" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.888040 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd99b764c-m9sbd"] Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.888469 4831 scope.go:117] "RemoveContainer" containerID="63e9cfeb1a5cfb3fac1bcd97a46757cbcf6b9551dcd5ce0918a87fe130081593" Dec 03 08:03:43 crc kubenswrapper[4831]: I1203 08:03:43.897699 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd99b764c-m9sbd"] Dec 03 08:03:45 crc kubenswrapper[4831]: I1203 08:03:45.031177 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d951b914-3516-4abd-83af-6b7fcf91d390" path="/var/lib/kubelet/pods/d951b914-3516-4abd-83af-6b7fcf91d390/volumes" Dec 03 08:03:46 crc kubenswrapper[4831]: I1203 08:03:46.132530 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 08:03:46 crc kubenswrapper[4831]: I1203 08:03:46.132592 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 08:03:46 crc kubenswrapper[4831]: I1203 08:03:46.194228 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 08:03:46 crc kubenswrapper[4831]: I1203 08:03:46.208216 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Dec 03 08:03:46 crc kubenswrapper[4831]: I1203 08:03:46.888633 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 08:03:46 crc kubenswrapper[4831]: I1203 08:03:46.888676 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 08:03:48 crc kubenswrapper[4831]: I1203 08:03:48.192149 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:48 crc kubenswrapper[4831]: I1203 08:03:48.192461 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:48 crc kubenswrapper[4831]: I1203 08:03:48.241872 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:48 crc kubenswrapper[4831]: I1203 08:03:48.258466 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:48 crc kubenswrapper[4831]: I1203 08:03:48.876638 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 08:03:48 crc kubenswrapper[4831]: I1203 08:03:48.880127 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 08:03:48 crc kubenswrapper[4831]: I1203 08:03:48.916907 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:48 crc kubenswrapper[4831]: I1203 08:03:48.916961 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:50 crc kubenswrapper[4831]: I1203 08:03:50.836415 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Dec 03 08:03:50 crc kubenswrapper[4831]: I1203 08:03:50.916920 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.297273 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-rmdsf"] Dec 03 08:03:59 crc kubenswrapper[4831]: E1203 08:03:59.298342 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d951b914-3516-4abd-83af-6b7fcf91d390" containerName="dnsmasq-dns" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.298364 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d951b914-3516-4abd-83af-6b7fcf91d390" containerName="dnsmasq-dns" Dec 03 08:03:59 crc kubenswrapper[4831]: E1203 08:03:59.298394 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d951b914-3516-4abd-83af-6b7fcf91d390" containerName="init" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.298406 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d951b914-3516-4abd-83af-6b7fcf91d390" containerName="init" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.298674 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d951b914-3516-4abd-83af-6b7fcf91d390" containerName="dnsmasq-dns" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.300241 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rmdsf" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.339054 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rmdsf"] Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.386794 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-d99f-account-create-update-7hbdc"] Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.388238 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.390484 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.395828 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5hk6\" (UniqueName: \"kubernetes.io/projected/be898fbd-2ab0-4c28-8889-dc773c95348e-kube-api-access-m5hk6\") pod \"placement-db-create-rmdsf\" (UID: \"be898fbd-2ab0-4c28-8889-dc773c95348e\") " pod="openstack/placement-db-create-rmdsf" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.395932 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be898fbd-2ab0-4c28-8889-dc773c95348e-operator-scripts\") pod \"placement-db-create-rmdsf\" (UID: \"be898fbd-2ab0-4c28-8889-dc773c95348e\") " pod="openstack/placement-db-create-rmdsf" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.415918 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d99f-account-create-update-7hbdc"] Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.497415 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62186717-0243-43c0-be16-bc066648e2ac-operator-scripts\") pod \"placement-d99f-account-create-update-7hbdc\" (UID: \"62186717-0243-43c0-be16-bc066648e2ac\") " pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.497473 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dhjv8\" (UniqueName: \"kubernetes.io/projected/62186717-0243-43c0-be16-bc066648e2ac-kube-api-access-dhjv8\") pod 
\"placement-d99f-account-create-update-7hbdc\" (UID: \"62186717-0243-43c0-be16-bc066648e2ac\") " pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.497528 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5hk6\" (UniqueName: \"kubernetes.io/projected/be898fbd-2ab0-4c28-8889-dc773c95348e-kube-api-access-m5hk6\") pod \"placement-db-create-rmdsf\" (UID: \"be898fbd-2ab0-4c28-8889-dc773c95348e\") " pod="openstack/placement-db-create-rmdsf" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.497596 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be898fbd-2ab0-4c28-8889-dc773c95348e-operator-scripts\") pod \"placement-db-create-rmdsf\" (UID: \"be898fbd-2ab0-4c28-8889-dc773c95348e\") " pod="openstack/placement-db-create-rmdsf" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.498426 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be898fbd-2ab0-4c28-8889-dc773c95348e-operator-scripts\") pod \"placement-db-create-rmdsf\" (UID: \"be898fbd-2ab0-4c28-8889-dc773c95348e\") " pod="openstack/placement-db-create-rmdsf" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.544894 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5hk6\" (UniqueName: \"kubernetes.io/projected/be898fbd-2ab0-4c28-8889-dc773c95348e-kube-api-access-m5hk6\") pod \"placement-db-create-rmdsf\" (UID: \"be898fbd-2ab0-4c28-8889-dc773c95348e\") " pod="openstack/placement-db-create-rmdsf" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.599213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62186717-0243-43c0-be16-bc066648e2ac-operator-scripts\") pod 
\"placement-d99f-account-create-update-7hbdc\" (UID: \"62186717-0243-43c0-be16-bc066648e2ac\") " pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.599299 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dhjv8\" (UniqueName: \"kubernetes.io/projected/62186717-0243-43c0-be16-bc066648e2ac-kube-api-access-dhjv8\") pod \"placement-d99f-account-create-update-7hbdc\" (UID: \"62186717-0243-43c0-be16-bc066648e2ac\") " pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.600509 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62186717-0243-43c0-be16-bc066648e2ac-operator-scripts\") pod \"placement-d99f-account-create-update-7hbdc\" (UID: \"62186717-0243-43c0-be16-bc066648e2ac\") " pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.617679 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dhjv8\" (UniqueName: \"kubernetes.io/projected/62186717-0243-43c0-be16-bc066648e2ac-kube-api-access-dhjv8\") pod \"placement-d99f-account-create-update-7hbdc\" (UID: \"62186717-0243-43c0-be16-bc066648e2ac\") " pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.630651 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-rmdsf" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.707496 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:03:59 crc kubenswrapper[4831]: I1203 08:03:59.912948 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-rmdsf"] Dec 03 08:04:00 crc kubenswrapper[4831]: I1203 08:04:00.047967 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rmdsf" event={"ID":"be898fbd-2ab0-4c28-8889-dc773c95348e","Type":"ContainerStarted","Data":"068918d2c0095f17b2ca916de15a61ef5562ccb7fcb1e704e699f162b0034fd1"} Dec 03 08:04:00 crc kubenswrapper[4831]: W1203 08:04:00.247222 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62186717_0243_43c0_be16_bc066648e2ac.slice/crio-2a6a7e3d6cbc0244c6acbfd7be24feb38b49de199d49c2f58d061781d607356f WatchSource:0}: Error finding container 2a6a7e3d6cbc0244c6acbfd7be24feb38b49de199d49c2f58d061781d607356f: Status 404 returned error can't find the container with id 2a6a7e3d6cbc0244c6acbfd7be24feb38b49de199d49c2f58d061781d607356f Dec 03 08:04:00 crc kubenswrapper[4831]: I1203 08:04:00.249813 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-d99f-account-create-update-7hbdc"] Dec 03 08:04:00 crc kubenswrapper[4831]: E1203 08:04:00.925700 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62186717_0243_43c0_be16_bc066648e2ac.slice/crio-conmon-4e01e64ab1f88f36dd171cfe9a36cd281b44d569786258a1401e360ac84ea1bb.scope\": RecentStats: unable to find data in memory cache]" Dec 03 08:04:01 crc kubenswrapper[4831]: I1203 08:04:01.060604 4831 generic.go:334] "Generic (PLEG): container finished" podID="62186717-0243-43c0-be16-bc066648e2ac" containerID="4e01e64ab1f88f36dd171cfe9a36cd281b44d569786258a1401e360ac84ea1bb" exitCode=0 Dec 03 08:04:01 crc kubenswrapper[4831]: 
I1203 08:04:01.060663 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d99f-account-create-update-7hbdc" event={"ID":"62186717-0243-43c0-be16-bc066648e2ac","Type":"ContainerDied","Data":"4e01e64ab1f88f36dd171cfe9a36cd281b44d569786258a1401e360ac84ea1bb"} Dec 03 08:04:01 crc kubenswrapper[4831]: I1203 08:04:01.061002 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d99f-account-create-update-7hbdc" event={"ID":"62186717-0243-43c0-be16-bc066648e2ac","Type":"ContainerStarted","Data":"2a6a7e3d6cbc0244c6acbfd7be24feb38b49de199d49c2f58d061781d607356f"} Dec 03 08:04:01 crc kubenswrapper[4831]: I1203 08:04:01.064223 4831 generic.go:334] "Generic (PLEG): container finished" podID="be898fbd-2ab0-4c28-8889-dc773c95348e" containerID="63d6d01accef261a28212cf6f728bfc2b8a9c76e1d5eb7bf598a62e5d3341ae2" exitCode=0 Dec 03 08:04:01 crc kubenswrapper[4831]: I1203 08:04:01.064295 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rmdsf" event={"ID":"be898fbd-2ab0-4c28-8889-dc773c95348e","Type":"ContainerDied","Data":"63d6d01accef261a28212cf6f728bfc2b8a9c76e1d5eb7bf598a62e5d3341ae2"} Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.587071 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.595807 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rmdsf" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.690030 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dhjv8\" (UniqueName: \"kubernetes.io/projected/62186717-0243-43c0-be16-bc066648e2ac-kube-api-access-dhjv8\") pod \"62186717-0243-43c0-be16-bc066648e2ac\" (UID: \"62186717-0243-43c0-be16-bc066648e2ac\") " Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.690194 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62186717-0243-43c0-be16-bc066648e2ac-operator-scripts\") pod \"62186717-0243-43c0-be16-bc066648e2ac\" (UID: \"62186717-0243-43c0-be16-bc066648e2ac\") " Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.691174 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m5hk6\" (UniqueName: \"kubernetes.io/projected/be898fbd-2ab0-4c28-8889-dc773c95348e-kube-api-access-m5hk6\") pod \"be898fbd-2ab0-4c28-8889-dc773c95348e\" (UID: \"be898fbd-2ab0-4c28-8889-dc773c95348e\") " Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.691300 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be898fbd-2ab0-4c28-8889-dc773c95348e-operator-scripts\") pod \"be898fbd-2ab0-4c28-8889-dc773c95348e\" (UID: \"be898fbd-2ab0-4c28-8889-dc773c95348e\") " Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.692995 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/62186717-0243-43c0-be16-bc066648e2ac-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "62186717-0243-43c0-be16-bc066648e2ac" (UID: "62186717-0243-43c0-be16-bc066648e2ac"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.693010 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/be898fbd-2ab0-4c28-8889-dc773c95348e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be898fbd-2ab0-4c28-8889-dc773c95348e" (UID: "be898fbd-2ab0-4c28-8889-dc773c95348e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.701545 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be898fbd-2ab0-4c28-8889-dc773c95348e-kube-api-access-m5hk6" (OuterVolumeSpecName: "kube-api-access-m5hk6") pod "be898fbd-2ab0-4c28-8889-dc773c95348e" (UID: "be898fbd-2ab0-4c28-8889-dc773c95348e"). InnerVolumeSpecName "kube-api-access-m5hk6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.701597 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62186717-0243-43c0-be16-bc066648e2ac-kube-api-access-dhjv8" (OuterVolumeSpecName: "kube-api-access-dhjv8") pod "62186717-0243-43c0-be16-bc066648e2ac" (UID: "62186717-0243-43c0-be16-bc066648e2ac"). InnerVolumeSpecName "kube-api-access-dhjv8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.795961 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dhjv8\" (UniqueName: \"kubernetes.io/projected/62186717-0243-43c0-be16-bc066648e2ac-kube-api-access-dhjv8\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.796010 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/62186717-0243-43c0-be16-bc066648e2ac-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.796023 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m5hk6\" (UniqueName: \"kubernetes.io/projected/be898fbd-2ab0-4c28-8889-dc773c95348e-kube-api-access-m5hk6\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:02 crc kubenswrapper[4831]: I1203 08:04:02.796035 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be898fbd-2ab0-4c28-8889-dc773c95348e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:03 crc kubenswrapper[4831]: I1203 08:04:03.091149 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-rmdsf" event={"ID":"be898fbd-2ab0-4c28-8889-dc773c95348e","Type":"ContainerDied","Data":"068918d2c0095f17b2ca916de15a61ef5562ccb7fcb1e704e699f162b0034fd1"} Dec 03 08:04:03 crc kubenswrapper[4831]: I1203 08:04:03.091192 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-rmdsf" Dec 03 08:04:03 crc kubenswrapper[4831]: I1203 08:04:03.091232 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="068918d2c0095f17b2ca916de15a61ef5562ccb7fcb1e704e699f162b0034fd1" Dec 03 08:04:03 crc kubenswrapper[4831]: I1203 08:04:03.093507 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-d99f-account-create-update-7hbdc" event={"ID":"62186717-0243-43c0-be16-bc066648e2ac","Type":"ContainerDied","Data":"2a6a7e3d6cbc0244c6acbfd7be24feb38b49de199d49c2f58d061781d607356f"} Dec 03 08:04:03 crc kubenswrapper[4831]: I1203 08:04:03.093551 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a6a7e3d6cbc0244c6acbfd7be24feb38b49de199d49c2f58d061781d607356f" Dec 03 08:04:03 crc kubenswrapper[4831]: I1203 08:04:03.093570 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-d99f-account-create-update-7hbdc" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.684390 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58577dbd7f-rl7pg"] Dec 03 08:04:04 crc kubenswrapper[4831]: E1203 08:04:04.685116 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62186717-0243-43c0-be16-bc066648e2ac" containerName="mariadb-account-create-update" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.685134 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="62186717-0243-43c0-be16-bc066648e2ac" containerName="mariadb-account-create-update" Dec 03 08:04:04 crc kubenswrapper[4831]: E1203 08:04:04.685175 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be898fbd-2ab0-4c28-8889-dc773c95348e" containerName="mariadb-database-create" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.685183 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="be898fbd-2ab0-4c28-8889-dc773c95348e" 
containerName="mariadb-database-create" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.686510 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="62186717-0243-43c0-be16-bc066648e2ac" containerName="mariadb-account-create-update" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.686548 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="be898fbd-2ab0-4c28-8889-dc773c95348e" containerName="mariadb-database-create" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.687768 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.702280 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58577dbd7f-rl7pg"] Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.749241 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-st2kv"] Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.750821 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.756043 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-65gts" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.756152 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.756060 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.759434 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-st2kv"] Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.840556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-logs\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.840644 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgwfw\" (UniqueName: \"kubernetes.io/projected/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-kube-api-access-fgwfw\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.840672 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-config\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.840697 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-scripts\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.840878 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2twms\" (UniqueName: \"kubernetes.io/projected/fa9bf6d6-c8e0-4326-b177-e79139d03937-kube-api-access-2twms\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.840941 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-sb\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.840976 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-nb\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.841775 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-config-data\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 
08:04:04.841966 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-dns-svc\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.842013 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-combined-ca-bundle\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943114 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgwfw\" (UniqueName: \"kubernetes.io/projected/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-kube-api-access-fgwfw\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943161 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-config\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943179 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-scripts\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943214 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2twms\" (UniqueName: \"kubernetes.io/projected/fa9bf6d6-c8e0-4326-b177-e79139d03937-kube-api-access-2twms\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943237 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-sb\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943254 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-nb\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943289 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-config-data\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943350 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-dns-svc\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943367 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-combined-ca-bundle\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943389 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-logs\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.943898 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-logs\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.944025 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-config\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.944693 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-dns-svc\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.944739 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-sb\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " 
pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.944755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-nb\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.950498 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-combined-ca-bundle\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.952032 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-config-data\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.952388 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-scripts\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.962900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2twms\" (UniqueName: \"kubernetes.io/projected/fa9bf6d6-c8e0-4326-b177-e79139d03937-kube-api-access-2twms\") pod \"dnsmasq-dns-58577dbd7f-rl7pg\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:04 crc kubenswrapper[4831]: I1203 08:04:04.964206 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgwfw\" (UniqueName: \"kubernetes.io/projected/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-kube-api-access-fgwfw\") pod \"placement-db-sync-st2kv\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:05 crc kubenswrapper[4831]: I1203 08:04:05.047531 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:05 crc kubenswrapper[4831]: I1203 08:04:05.107871 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:05 crc kubenswrapper[4831]: I1203 08:04:05.418927 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58577dbd7f-rl7pg"] Dec 03 08:04:05 crc kubenswrapper[4831]: I1203 08:04:05.767545 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-st2kv"] Dec 03 08:04:05 crc kubenswrapper[4831]: W1203 08:04:05.769193 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda2d08214_4528_4f01_ab7b_70ae27f6bd7f.slice/crio-9b44e6390e1bd94617e658e67a676958861f060572916e6159d5b2c8ec6872da WatchSource:0}: Error finding container 9b44e6390e1bd94617e658e67a676958861f060572916e6159d5b2c8ec6872da: Status 404 returned error can't find the container with id 9b44e6390e1bd94617e658e67a676958861f060572916e6159d5b2c8ec6872da Dec 03 08:04:06 crc kubenswrapper[4831]: I1203 08:04:06.249950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-st2kv" event={"ID":"a2d08214-4528-4f01-ab7b-70ae27f6bd7f","Type":"ContainerStarted","Data":"98b5520b362a0a5d305ff43584b662103223bc53c4b1e3de871396757d8d30b4"} Dec 03 08:04:06 crc kubenswrapper[4831]: I1203 08:04:06.250948 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-db-sync-st2kv" event={"ID":"a2d08214-4528-4f01-ab7b-70ae27f6bd7f","Type":"ContainerStarted","Data":"9b44e6390e1bd94617e658e67a676958861f060572916e6159d5b2c8ec6872da"} Dec 03 08:04:06 crc kubenswrapper[4831]: I1203 08:04:06.257782 4831 generic.go:334] "Generic (PLEG): container finished" podID="fa9bf6d6-c8e0-4326-b177-e79139d03937" containerID="d74941ed6a0efa7b1657daaf7679f4b97520630b721e2316d02314679abd0a2c" exitCode=0 Dec 03 08:04:06 crc kubenswrapper[4831]: I1203 08:04:06.257933 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" event={"ID":"fa9bf6d6-c8e0-4326-b177-e79139d03937","Type":"ContainerDied","Data":"d74941ed6a0efa7b1657daaf7679f4b97520630b721e2316d02314679abd0a2c"} Dec 03 08:04:06 crc kubenswrapper[4831]: I1203 08:04:06.258003 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" event={"ID":"fa9bf6d6-c8e0-4326-b177-e79139d03937","Type":"ContainerStarted","Data":"d9d6478fa677ca946398269e09e5d623c07526d6a36fc293822bfb511d6ec230"} Dec 03 08:04:06 crc kubenswrapper[4831]: I1203 08:04:06.302964 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-st2kv" podStartSLOduration=2.30293783 podStartE2EDuration="2.30293783s" podCreationTimestamp="2025-12-03 08:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:04:06.268113166 +0000 UTC m=+5583.611696674" watchObservedRunningTime="2025-12-03 08:04:06.30293783 +0000 UTC m=+5583.646521348" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.275378 4831 generic.go:334] "Generic (PLEG): container finished" podID="a2d08214-4528-4f01-ab7b-70ae27f6bd7f" containerID="98b5520b362a0a5d305ff43584b662103223bc53c4b1e3de871396757d8d30b4" exitCode=0 Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.275612 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/placement-db-sync-st2kv" event={"ID":"a2d08214-4528-4f01-ab7b-70ae27f6bd7f","Type":"ContainerDied","Data":"98b5520b362a0a5d305ff43584b662103223bc53c4b1e3de871396757d8d30b4"} Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.279131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" event={"ID":"fa9bf6d6-c8e0-4326-b177-e79139d03937","Type":"ContainerStarted","Data":"52f8930b14a6ddaec0c4894c8222932387cf4257cfa44dbb75771337bc0f9100"} Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.279402 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.337136 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" podStartSLOduration=3.337103833 podStartE2EDuration="3.337103833s" podCreationTimestamp="2025-12-03 08:04:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:04:07.323383826 +0000 UTC m=+5584.666967344" watchObservedRunningTime="2025-12-03 08:04:07.337103833 +0000 UTC m=+5584.680687391" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.711903 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-szjzk"] Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.713863 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.719936 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-szjzk"] Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.817733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-catalog-content\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.817832 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrcs5\" (UniqueName: \"kubernetes.io/projected/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-kube-api-access-xrcs5\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.817872 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-utilities\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.920103 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrcs5\" (UniqueName: \"kubernetes.io/projected/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-kube-api-access-xrcs5\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.920235 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-utilities\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.920388 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-catalog-content\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.921708 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-catalog-content\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.921728 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-utilities\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:07 crc kubenswrapper[4831]: I1203 08:04:07.944261 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrcs5\" (UniqueName: \"kubernetes.io/projected/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-kube-api-access-xrcs5\") pod \"certified-operators-szjzk\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.031278 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.644397 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-szjzk"] Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.813137 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.848084 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-scripts\") pod \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.848225 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-config-data\") pod \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.848280 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-combined-ca-bundle\") pod \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.848382 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-logs\") pod \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.848415 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgwfw\" (UniqueName: 
\"kubernetes.io/projected/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-kube-api-access-fgwfw\") pod \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\" (UID: \"a2d08214-4528-4f01-ab7b-70ae27f6bd7f\") " Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.850556 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-logs" (OuterVolumeSpecName: "logs") pod "a2d08214-4528-4f01-ab7b-70ae27f6bd7f" (UID: "a2d08214-4528-4f01-ab7b-70ae27f6bd7f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.854843 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-kube-api-access-fgwfw" (OuterVolumeSpecName: "kube-api-access-fgwfw") pod "a2d08214-4528-4f01-ab7b-70ae27f6bd7f" (UID: "a2d08214-4528-4f01-ab7b-70ae27f6bd7f"). InnerVolumeSpecName "kube-api-access-fgwfw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.855264 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-scripts" (OuterVolumeSpecName: "scripts") pod "a2d08214-4528-4f01-ab7b-70ae27f6bd7f" (UID: "a2d08214-4528-4f01-ab7b-70ae27f6bd7f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.873803 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-config-data" (OuterVolumeSpecName: "config-data") pod "a2d08214-4528-4f01-ab7b-70ae27f6bd7f" (UID: "a2d08214-4528-4f01-ab7b-70ae27f6bd7f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.874671 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a2d08214-4528-4f01-ab7b-70ae27f6bd7f" (UID: "a2d08214-4528-4f01-ab7b-70ae27f6bd7f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.950834 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.950905 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.950926 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.950946 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:08 crc kubenswrapper[4831]: I1203 08:04:08.950964 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgwfw\" (UniqueName: \"kubernetes.io/projected/a2d08214-4528-4f01-ab7b-70ae27f6bd7f-kube-api-access-fgwfw\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:09 crc kubenswrapper[4831]: I1203 08:04:09.324913 4831 generic.go:334] "Generic (PLEG): container finished" podID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" 
containerID="07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851" exitCode=0 Dec 03 08:04:09 crc kubenswrapper[4831]: I1203 08:04:09.325034 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szjzk" event={"ID":"174db1b1-c4bc-4d29-9c84-f5eea263a9e9","Type":"ContainerDied","Data":"07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851"} Dec 03 08:04:09 crc kubenswrapper[4831]: I1203 08:04:09.325525 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szjzk" event={"ID":"174db1b1-c4bc-4d29-9c84-f5eea263a9e9","Type":"ContainerStarted","Data":"dfe3898739520ca726b77cd7f899ce91ddc33a889c63fa0f7bf838cfa42848bf"} Dec 03 08:04:09 crc kubenswrapper[4831]: I1203 08:04:09.336754 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-st2kv" event={"ID":"a2d08214-4528-4f01-ab7b-70ae27f6bd7f","Type":"ContainerDied","Data":"9b44e6390e1bd94617e658e67a676958861f060572916e6159d5b2c8ec6872da"} Dec 03 08:04:09 crc kubenswrapper[4831]: I1203 08:04:09.336817 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9b44e6390e1bd94617e658e67a676958861f060572916e6159d5b2c8ec6872da" Dec 03 08:04:09 crc kubenswrapper[4831]: I1203 08:04:09.336844 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-st2kv" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.038866 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-7647b977b-m9c66"] Dec 03 08:04:10 crc kubenswrapper[4831]: E1203 08:04:10.039571 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2d08214-4528-4f01-ab7b-70ae27f6bd7f" containerName="placement-db-sync" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.039587 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2d08214-4528-4f01-ab7b-70ae27f6bd7f" containerName="placement-db-sync" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.039819 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2d08214-4528-4f01-ab7b-70ae27f6bd7f" containerName="placement-db-sync" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.053033 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.061007 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.061436 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-65gts" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.062417 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.078197 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd70bab-bce8-47d5-a5cd-115fb729ec02-logs\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.078270 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-scripts\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.078556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-combined-ca-bundle\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.078595 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-config-data\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.078621 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr2cg\" (UniqueName: \"kubernetes.io/projected/3cd70bab-bce8-47d5-a5cd-115fb729ec02-kube-api-access-xr2cg\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.102681 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7647b977b-m9c66"] Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.180398 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-combined-ca-bundle\") pod \"placement-7647b977b-m9c66\" (UID: 
\"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.180452 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-config-data\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.180475 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xr2cg\" (UniqueName: \"kubernetes.io/projected/3cd70bab-bce8-47d5-a5cd-115fb729ec02-kube-api-access-xr2cg\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.180523 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd70bab-bce8-47d5-a5cd-115fb729ec02-logs\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.180562 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-scripts\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.181246 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3cd70bab-bce8-47d5-a5cd-115fb729ec02-logs\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 
08:04:10.185706 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-combined-ca-bundle\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.205097 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-config-data\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.205462 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3cd70bab-bce8-47d5-a5cd-115fb729ec02-scripts\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.205742 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr2cg\" (UniqueName: \"kubernetes.io/projected/3cd70bab-bce8-47d5-a5cd-115fb729ec02-kube-api-access-xr2cg\") pod \"placement-7647b977b-m9c66\" (UID: \"3cd70bab-bce8-47d5-a5cd-115fb729ec02\") " pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.393988 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:10 crc kubenswrapper[4831]: I1203 08:04:10.926665 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-7647b977b-m9c66"] Dec 03 08:04:11 crc kubenswrapper[4831]: I1203 08:04:11.356801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7647b977b-m9c66" event={"ID":"3cd70bab-bce8-47d5-a5cd-115fb729ec02","Type":"ContainerStarted","Data":"6cc4de28d49d7f99087961c0eb2830565298a28e15c92f19ac15926e178697be"} Dec 03 08:04:11 crc kubenswrapper[4831]: I1203 08:04:11.359419 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7647b977b-m9c66" event={"ID":"3cd70bab-bce8-47d5-a5cd-115fb729ec02","Type":"ContainerStarted","Data":"5cb823bbab6961ba1623fa95b46408dd601eebba686a3bc1002085a17efe3740"} Dec 03 08:04:11 crc kubenswrapper[4831]: I1203 08:04:11.362610 4831 generic.go:334] "Generic (PLEG): container finished" podID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerID="376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25" exitCode=0 Dec 03 08:04:11 crc kubenswrapper[4831]: I1203 08:04:11.362682 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szjzk" event={"ID":"174db1b1-c4bc-4d29-9c84-f5eea263a9e9","Type":"ContainerDied","Data":"376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25"} Dec 03 08:04:12 crc kubenswrapper[4831]: I1203 08:04:12.376663 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szjzk" event={"ID":"174db1b1-c4bc-4d29-9c84-f5eea263a9e9","Type":"ContainerStarted","Data":"488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4"} Dec 03 08:04:12 crc kubenswrapper[4831]: I1203 08:04:12.378939 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-7647b977b-m9c66" 
event={"ID":"3cd70bab-bce8-47d5-a5cd-115fb729ec02","Type":"ContainerStarted","Data":"f57db41171d1e105e511700fef8e0404852cb8911e9b53d9a6a424bb6fffdcdf"} Dec 03 08:04:12 crc kubenswrapper[4831]: I1203 08:04:12.379092 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:12 crc kubenswrapper[4831]: I1203 08:04:12.412428 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-szjzk" podStartSLOduration=2.927780564 podStartE2EDuration="5.412399996s" podCreationTimestamp="2025-12-03 08:04:07 +0000 UTC" firstStartedPulling="2025-12-03 08:04:09.336283627 +0000 UTC m=+5586.679867135" lastFinishedPulling="2025-12-03 08:04:11.820903019 +0000 UTC m=+5589.164486567" observedRunningTime="2025-12-03 08:04:12.40194737 +0000 UTC m=+5589.745530958" watchObservedRunningTime="2025-12-03 08:04:12.412399996 +0000 UTC m=+5589.755983544" Dec 03 08:04:12 crc kubenswrapper[4831]: I1203 08:04:12.433071 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-7647b977b-m9c66" podStartSLOduration=2.433033108 podStartE2EDuration="2.433033108s" podCreationTimestamp="2025-12-03 08:04:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:04:12.423919384 +0000 UTC m=+5589.767502932" watchObservedRunningTime="2025-12-03 08:04:12.433033108 +0000 UTC m=+5589.776616686" Dec 03 08:04:13 crc kubenswrapper[4831]: I1203 08:04:13.415047 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.049651 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.143606 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-dd7cfb74c-5xxmr"] Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.144010 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" podUID="b9d47adc-4f70-49c6-9211-8718d57aeafc" containerName="dnsmasq-dns" containerID="cri-o://fbdfcf6d5f00426b8e5479f94a7ae967f1a591c51409021da72a815607ed2fa6" gracePeriod=10 Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.440005 4831 generic.go:334] "Generic (PLEG): container finished" podID="b9d47adc-4f70-49c6-9211-8718d57aeafc" containerID="fbdfcf6d5f00426b8e5479f94a7ae967f1a591c51409021da72a815607ed2fa6" exitCode=0 Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.440058 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" event={"ID":"b9d47adc-4f70-49c6-9211-8718d57aeafc","Type":"ContainerDied","Data":"fbdfcf6d5f00426b8e5479f94a7ae967f1a591c51409021da72a815607ed2fa6"} Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.696225 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.854285 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-sb\") pod \"b9d47adc-4f70-49c6-9211-8718d57aeafc\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.854377 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5v9x\" (UniqueName: \"kubernetes.io/projected/b9d47adc-4f70-49c6-9211-8718d57aeafc-kube-api-access-r5v9x\") pod \"b9d47adc-4f70-49c6-9211-8718d57aeafc\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.854407 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-dns-svc\") pod \"b9d47adc-4f70-49c6-9211-8718d57aeafc\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.854576 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-config\") pod \"b9d47adc-4f70-49c6-9211-8718d57aeafc\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.854606 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-nb\") pod \"b9d47adc-4f70-49c6-9211-8718d57aeafc\" (UID: \"b9d47adc-4f70-49c6-9211-8718d57aeafc\") " Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.862893 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/b9d47adc-4f70-49c6-9211-8718d57aeafc-kube-api-access-r5v9x" (OuterVolumeSpecName: "kube-api-access-r5v9x") pod "b9d47adc-4f70-49c6-9211-8718d57aeafc" (UID: "b9d47adc-4f70-49c6-9211-8718d57aeafc"). InnerVolumeSpecName "kube-api-access-r5v9x". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.906976 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b9d47adc-4f70-49c6-9211-8718d57aeafc" (UID: "b9d47adc-4f70-49c6-9211-8718d57aeafc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.909362 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-config" (OuterVolumeSpecName: "config") pod "b9d47adc-4f70-49c6-9211-8718d57aeafc" (UID: "b9d47adc-4f70-49c6-9211-8718d57aeafc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.913217 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b9d47adc-4f70-49c6-9211-8718d57aeafc" (UID: "b9d47adc-4f70-49c6-9211-8718d57aeafc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.923066 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b9d47adc-4f70-49c6-9211-8718d57aeafc" (UID: "b9d47adc-4f70-49c6-9211-8718d57aeafc"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.956840 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.956876 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.956890 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.956906 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5v9x\" (UniqueName: \"kubernetes.io/projected/b9d47adc-4f70-49c6-9211-8718d57aeafc-kube-api-access-r5v9x\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:15 crc kubenswrapper[4831]: I1203 08:04:15.956921 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b9d47adc-4f70-49c6-9211-8718d57aeafc-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:16 crc kubenswrapper[4831]: I1203 08:04:16.453228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" event={"ID":"b9d47adc-4f70-49c6-9211-8718d57aeafc","Type":"ContainerDied","Data":"67f7f1119218f29ddb809726c0bb11bb8199faee4d7028f8bdfd989617c7aa7e"} Dec 03 08:04:16 crc kubenswrapper[4831]: I1203 08:04:16.453596 4831 scope.go:117] "RemoveContainer" containerID="fbdfcf6d5f00426b8e5479f94a7ae967f1a591c51409021da72a815607ed2fa6" Dec 03 08:04:16 crc kubenswrapper[4831]: I1203 08:04:16.453422 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-dd7cfb74c-5xxmr" Dec 03 08:04:16 crc kubenswrapper[4831]: I1203 08:04:16.493814 4831 scope.go:117] "RemoveContainer" containerID="22ce868330d9c75e7f31bccc280fd896cc9be5642fc95ffc555ba44f020139bc" Dec 03 08:04:16 crc kubenswrapper[4831]: I1203 08:04:16.510335 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-dd7cfb74c-5xxmr"] Dec 03 08:04:16 crc kubenswrapper[4831]: I1203 08:04:16.520447 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-dd7cfb74c-5xxmr"] Dec 03 08:04:17 crc kubenswrapper[4831]: I1203 08:04:17.030489 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9d47adc-4f70-49c6-9211-8718d57aeafc" path="/var/lib/kubelet/pods/b9d47adc-4f70-49c6-9211-8718d57aeafc/volumes" Dec 03 08:04:18 crc kubenswrapper[4831]: I1203 08:04:18.031676 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:18 crc kubenswrapper[4831]: I1203 08:04:18.031780 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:18 crc kubenswrapper[4831]: I1203 08:04:18.119413 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:18 crc kubenswrapper[4831]: I1203 08:04:18.554526 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:18 crc kubenswrapper[4831]: I1203 08:04:18.613442 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-szjzk"] Dec 03 08:04:19 crc kubenswrapper[4831]: I1203 08:04:19.432247 4831 scope.go:117] "RemoveContainer" containerID="3c6d705e4d2ae977f2dd90a037a2aae5787f76589898c4b2620b499a8514f34a" Dec 03 08:04:19 crc kubenswrapper[4831]: I1203 08:04:19.472754 
4831 scope.go:117] "RemoveContainer" containerID="7011694c744a2a9af9fe6a52114969cdc7e7b33736f2062ec3de97055b6631c0" Dec 03 08:04:20 crc kubenswrapper[4831]: I1203 08:04:20.502763 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-szjzk" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerName="registry-server" containerID="cri-o://488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4" gracePeriod=2 Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.023858 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.160819 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrcs5\" (UniqueName: \"kubernetes.io/projected/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-kube-api-access-xrcs5\") pod \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.161057 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-utilities\") pod \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.161126 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-catalog-content\") pod \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\" (UID: \"174db1b1-c4bc-4d29-9c84-f5eea263a9e9\") " Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.163698 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-utilities" (OuterVolumeSpecName: 
"utilities") pod "174db1b1-c4bc-4d29-9c84-f5eea263a9e9" (UID: "174db1b1-c4bc-4d29-9c84-f5eea263a9e9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.169312 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-kube-api-access-xrcs5" (OuterVolumeSpecName: "kube-api-access-xrcs5") pod "174db1b1-c4bc-4d29-9c84-f5eea263a9e9" (UID: "174db1b1-c4bc-4d29-9c84-f5eea263a9e9"). InnerVolumeSpecName "kube-api-access-xrcs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.262681 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrcs5\" (UniqueName: \"kubernetes.io/projected/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-kube-api-access-xrcs5\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.262764 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.440646 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "174db1b1-c4bc-4d29-9c84-f5eea263a9e9" (UID: "174db1b1-c4bc-4d29-9c84-f5eea263a9e9"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.465033 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/174db1b1-c4bc-4d29-9c84-f5eea263a9e9-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.516065 4831 generic.go:334] "Generic (PLEG): container finished" podID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerID="488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4" exitCode=0 Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.516143 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szjzk" event={"ID":"174db1b1-c4bc-4d29-9c84-f5eea263a9e9","Type":"ContainerDied","Data":"488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4"} Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.516158 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-szjzk" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.516179 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-szjzk" event={"ID":"174db1b1-c4bc-4d29-9c84-f5eea263a9e9","Type":"ContainerDied","Data":"dfe3898739520ca726b77cd7f899ce91ddc33a889c63fa0f7bf838cfa42848bf"} Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.516202 4831 scope.go:117] "RemoveContainer" containerID="488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.557190 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-szjzk"] Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.558262 4831 scope.go:117] "RemoveContainer" containerID="376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.586072 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-szjzk"] Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.595838 4831 scope.go:117] "RemoveContainer" containerID="07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.629994 4831 scope.go:117] "RemoveContainer" containerID="488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4" Dec 03 08:04:21 crc kubenswrapper[4831]: E1203 08:04:21.630580 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4\": container with ID starting with 488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4 not found: ID does not exist" containerID="488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.630638 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4"} err="failed to get container status \"488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4\": rpc error: code = NotFound desc = could not find container \"488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4\": container with ID starting with 488a72333ddb5b7103ba04b6b6d2d94fd2f595931c514f70235ab089d4d1c0f4 not found: ID does not exist" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.630674 4831 scope.go:117] "RemoveContainer" containerID="376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25" Dec 03 08:04:21 crc kubenswrapper[4831]: E1203 08:04:21.631001 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25\": container with ID starting with 376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25 not found: ID does not exist" containerID="376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.631040 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25"} err="failed to get container status \"376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25\": rpc error: code = NotFound desc = could not find container \"376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25\": container with ID starting with 376fb2d1e5a1b1ba707bd6871e86d6b81ca17b9224708cabd171ef57d10b4a25 not found: ID does not exist" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.631061 4831 scope.go:117] "RemoveContainer" containerID="07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851" Dec 03 08:04:21 crc kubenswrapper[4831]: E1203 
08:04:21.631402 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851\": container with ID starting with 07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851 not found: ID does not exist" containerID="07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851" Dec 03 08:04:21 crc kubenswrapper[4831]: I1203 08:04:21.631469 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851"} err="failed to get container status \"07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851\": rpc error: code = NotFound desc = could not find container \"07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851\": container with ID starting with 07922e231a6659b26f0077b32e5ea57830cfe298f672e299891884a5419a0851 not found: ID does not exist" Dec 03 08:04:23 crc kubenswrapper[4831]: I1203 08:04:23.036528 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" path="/var/lib/kubelet/pods/174db1b1-c4bc-4d29-9c84-f5eea263a9e9/volumes" Dec 03 08:04:42 crc kubenswrapper[4831]: I1203 08:04:42.418074 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:43 crc kubenswrapper[4831]: I1203 08:04:43.427958 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-7647b977b-m9c66" Dec 03 08:04:57 crc kubenswrapper[4831]: I1203 08:04:57.596730 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:04:57 crc 
kubenswrapper[4831]: I1203 08:04:57.597635 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.147978 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-mzhpz"] Dec 03 08:05:07 crc kubenswrapper[4831]: E1203 08:05:07.149418 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerName="extract-utilities" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.149443 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerName="extract-utilities" Dec 03 08:05:07 crc kubenswrapper[4831]: E1203 08:05:07.149475 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerName="extract-content" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.149484 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerName="extract-content" Dec 03 08:05:07 crc kubenswrapper[4831]: E1203 08:05:07.149508 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d47adc-4f70-49c6-9211-8718d57aeafc" containerName="init" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.149515 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d47adc-4f70-49c6-9211-8718d57aeafc" containerName="init" Dec 03 08:05:07 crc kubenswrapper[4831]: E1203 08:05:07.149544 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerName="registry-server" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.149554 4831 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerName="registry-server" Dec 03 08:05:07 crc kubenswrapper[4831]: E1203 08:05:07.149579 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b9d47adc-4f70-49c6-9211-8718d57aeafc" containerName="dnsmasq-dns" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.149597 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b9d47adc-4f70-49c6-9211-8718d57aeafc" containerName="dnsmasq-dns" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.150050 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="174db1b1-c4bc-4d29-9c84-f5eea263a9e9" containerName="registry-server" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.150081 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b9d47adc-4f70-49c6-9211-8718d57aeafc" containerName="dnsmasq-dns" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.151428 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.174299 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mzhpz"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.239679 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-7w2vb"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.240888 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.250865 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7w2vb"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.254965 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssbx2\" (UniqueName: \"kubernetes.io/projected/826d082d-eb2b-448b-bcdb-5b74e20e492a-kube-api-access-ssbx2\") pod \"nova-api-db-create-mzhpz\" (UID: \"826d082d-eb2b-448b-bcdb-5b74e20e492a\") " pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.255001 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826d082d-eb2b-448b-bcdb-5b74e20e492a-operator-scripts\") pod \"nova-api-db-create-mzhpz\" (UID: \"826d082d-eb2b-448b-bcdb-5b74e20e492a\") " pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.357061 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ssbx2\" (UniqueName: \"kubernetes.io/projected/826d082d-eb2b-448b-bcdb-5b74e20e492a-kube-api-access-ssbx2\") pod \"nova-api-db-create-mzhpz\" (UID: \"826d082d-eb2b-448b-bcdb-5b74e20e492a\") " pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.357135 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826d082d-eb2b-448b-bcdb-5b74e20e492a-operator-scripts\") pod \"nova-api-db-create-mzhpz\" (UID: \"826d082d-eb2b-448b-bcdb-5b74e20e492a\") " pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.357216 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-operator-scripts\") pod \"nova-cell0-db-create-7w2vb\" (UID: \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\") " pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.357258 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjlb9\" (UniqueName: \"kubernetes.io/projected/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-kube-api-access-kjlb9\") pod \"nova-cell0-db-create-7w2vb\" (UID: \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\") " pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.359197 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826d082d-eb2b-448b-bcdb-5b74e20e492a-operator-scripts\") pod \"nova-api-db-create-mzhpz\" (UID: \"826d082d-eb2b-448b-bcdb-5b74e20e492a\") " pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.364118 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-799b-account-create-update-bs5mz"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.368821 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.369878 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-799b-account-create-update-bs5mz"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.370839 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.393484 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssbx2\" (UniqueName: \"kubernetes.io/projected/826d082d-eb2b-448b-bcdb-5b74e20e492a-kube-api-access-ssbx2\") pod \"nova-api-db-create-mzhpz\" (UID: \"826d082d-eb2b-448b-bcdb-5b74e20e492a\") " pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.434805 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-tqqqc"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.435901 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.450165 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tqqqc"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.467126 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltwbw\" (UniqueName: \"kubernetes.io/projected/0c80b66c-c8e7-4252-9a57-dbd41a97b743-kube-api-access-ltwbw\") pod \"nova-api-799b-account-create-update-bs5mz\" (UID: \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\") " pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.467193 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b66c-c8e7-4252-9a57-dbd41a97b743-operator-scripts\") pod \"nova-api-799b-account-create-update-bs5mz\" (UID: \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\") " pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.467242 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-operator-scripts\") pod \"nova-cell0-db-create-7w2vb\" (UID: \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\") " pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.467261 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjlb9\" (UniqueName: \"kubernetes.io/projected/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-kube-api-access-kjlb9\") pod \"nova-cell0-db-create-7w2vb\" (UID: \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\") " pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.467907 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-operator-scripts\") pod \"nova-cell0-db-create-7w2vb\" (UID: \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\") " pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.480726 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.482898 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjlb9\" (UniqueName: \"kubernetes.io/projected/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-kube-api-access-kjlb9\") pod \"nova-cell0-db-create-7w2vb\" (UID: \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\") " pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.553404 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-602f-account-create-update-zq45g"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.556950 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.559707 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.560212 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-602f-account-create-update-zq45g"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.565676 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.569414 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cptkk\" (UniqueName: \"kubernetes.io/projected/7a657f01-40c8-4311-9929-1c8f616fdbd2-kube-api-access-cptkk\") pod \"nova-cell1-db-create-tqqqc\" (UID: \"7a657f01-40c8-4311-9929-1c8f616fdbd2\") " pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.569490 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltwbw\" (UniqueName: \"kubernetes.io/projected/0c80b66c-c8e7-4252-9a57-dbd41a97b743-kube-api-access-ltwbw\") pod \"nova-api-799b-account-create-update-bs5mz\" (UID: \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\") " pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.569527 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b66c-c8e7-4252-9a57-dbd41a97b743-operator-scripts\") pod \"nova-api-799b-account-create-update-bs5mz\" (UID: \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\") " pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.569556 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a657f01-40c8-4311-9929-1c8f616fdbd2-operator-scripts\") pod \"nova-cell1-db-create-tqqqc\" (UID: \"7a657f01-40c8-4311-9929-1c8f616fdbd2\") " pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.573507 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b66c-c8e7-4252-9a57-dbd41a97b743-operator-scripts\") pod 
\"nova-api-799b-account-create-update-bs5mz\" (UID: \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\") " pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.588457 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltwbw\" (UniqueName: \"kubernetes.io/projected/0c80b66c-c8e7-4252-9a57-dbd41a97b743-kube-api-access-ltwbw\") pod \"nova-api-799b-account-create-update-bs5mz\" (UID: \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\") " pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.670992 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98f2f011-b635-4508-bc7a-9898bffb6ca4-operator-scripts\") pod \"nova-cell0-602f-account-create-update-zq45g\" (UID: \"98f2f011-b635-4508-bc7a-9898bffb6ca4\") " pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.671255 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwf4h\" (UniqueName: \"kubernetes.io/projected/98f2f011-b635-4508-bc7a-9898bffb6ca4-kube-api-access-qwf4h\") pod \"nova-cell0-602f-account-create-update-zq45g\" (UID: \"98f2f011-b635-4508-bc7a-9898bffb6ca4\") " pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.671298 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a657f01-40c8-4311-9929-1c8f616fdbd2-operator-scripts\") pod \"nova-cell1-db-create-tqqqc\" (UID: \"7a657f01-40c8-4311-9929-1c8f616fdbd2\") " pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.671442 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-cptkk\" (UniqueName: \"kubernetes.io/projected/7a657f01-40c8-4311-9929-1c8f616fdbd2-kube-api-access-cptkk\") pod \"nova-cell1-db-create-tqqqc\" (UID: \"7a657f01-40c8-4311-9929-1c8f616fdbd2\") " pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.673242 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-1d5b-account-create-update-8pz46"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.673788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a657f01-40c8-4311-9929-1c8f616fdbd2-operator-scripts\") pod \"nova-cell1-db-create-tqqqc\" (UID: \"7a657f01-40c8-4311-9929-1c8f616fdbd2\") " pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.674254 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.676652 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.691398 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.699708 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1d5b-account-create-update-8pz46"] Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.702668 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cptkk\" (UniqueName: \"kubernetes.io/projected/7a657f01-40c8-4311-9929-1c8f616fdbd2-kube-api-access-cptkk\") pod \"nova-cell1-db-create-tqqqc\" (UID: \"7a657f01-40c8-4311-9929-1c8f616fdbd2\") " pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.766503 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.775463 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98f2f011-b635-4508-bc7a-9898bffb6ca4-operator-scripts\") pod \"nova-cell0-602f-account-create-update-zq45g\" (UID: \"98f2f011-b635-4508-bc7a-9898bffb6ca4\") " pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.775526 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwf4h\" (UniqueName: \"kubernetes.io/projected/98f2f011-b635-4508-bc7a-9898bffb6ca4-kube-api-access-qwf4h\") pod \"nova-cell0-602f-account-create-update-zq45g\" (UID: \"98f2f011-b635-4508-bc7a-9898bffb6ca4\") " pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.775551 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5q2zl\" (UniqueName: \"kubernetes.io/projected/408a6d5e-0406-4be5-8618-55c294287e17-kube-api-access-5q2zl\") pod 
\"nova-cell1-1d5b-account-create-update-8pz46\" (UID: \"408a6d5e-0406-4be5-8618-55c294287e17\") " pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.775691 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/408a6d5e-0406-4be5-8618-55c294287e17-operator-scripts\") pod \"nova-cell1-1d5b-account-create-update-8pz46\" (UID: \"408a6d5e-0406-4be5-8618-55c294287e17\") " pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.776729 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98f2f011-b635-4508-bc7a-9898bffb6ca4-operator-scripts\") pod \"nova-cell0-602f-account-create-update-zq45g\" (UID: \"98f2f011-b635-4508-bc7a-9898bffb6ca4\") " pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.794580 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwf4h\" (UniqueName: \"kubernetes.io/projected/98f2f011-b635-4508-bc7a-9898bffb6ca4-kube-api-access-qwf4h\") pod \"nova-cell0-602f-account-create-update-zq45g\" (UID: \"98f2f011-b635-4508-bc7a-9898bffb6ca4\") " pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.877166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/408a6d5e-0406-4be5-8618-55c294287e17-operator-scripts\") pod \"nova-cell1-1d5b-account-create-update-8pz46\" (UID: \"408a6d5e-0406-4be5-8618-55c294287e17\") " pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.877243 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-5q2zl\" (UniqueName: \"kubernetes.io/projected/408a6d5e-0406-4be5-8618-55c294287e17-kube-api-access-5q2zl\") pod \"nova-cell1-1d5b-account-create-update-8pz46\" (UID: \"408a6d5e-0406-4be5-8618-55c294287e17\") " pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.877913 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/408a6d5e-0406-4be5-8618-55c294287e17-operator-scripts\") pod \"nova-cell1-1d5b-account-create-update-8pz46\" (UID: \"408a6d5e-0406-4be5-8618-55c294287e17\") " pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.892150 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5q2zl\" (UniqueName: \"kubernetes.io/projected/408a6d5e-0406-4be5-8618-55c294287e17-kube-api-access-5q2zl\") pod \"nova-cell1-1d5b-account-create-update-8pz46\" (UID: \"408a6d5e-0406-4be5-8618-55c294287e17\") " pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.937120 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:07 crc kubenswrapper[4831]: I1203 08:05:07.994672 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:08 crc kubenswrapper[4831]: I1203 08:05:08.016905 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-mzhpz"] Dec 03 08:05:08 crc kubenswrapper[4831]: I1203 08:05:08.086898 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7w2vb"] Dec 03 08:05:08 crc kubenswrapper[4831]: W1203 08:05:08.143126 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6bc239b6_ab23_4d2e_a970_ca26af1e40b2.slice/crio-e46992b69257c84199ceacf0b2cf67e13dc0411e9a0aa98a8b4033e3b4357555 WatchSource:0}: Error finding container e46992b69257c84199ceacf0b2cf67e13dc0411e9a0aa98a8b4033e3b4357555: Status 404 returned error can't find the container with id e46992b69257c84199ceacf0b2cf67e13dc0411e9a0aa98a8b4033e3b4357555 Dec 03 08:05:08 crc kubenswrapper[4831]: I1203 08:05:08.193394 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-799b-account-create-update-bs5mz"] Dec 03 08:05:08 crc kubenswrapper[4831]: W1203 08:05:08.204529 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c80b66c_c8e7_4252_9a57_dbd41a97b743.slice/crio-7a896392c5b85e90241e8d97c501c6e4b1db699dde267b555fdfa05d8049dcf3 WatchSource:0}: Error finding container 7a896392c5b85e90241e8d97c501c6e4b1db699dde267b555fdfa05d8049dcf3: Status 404 returned error can't find the container with id 7a896392c5b85e90241e8d97c501c6e4b1db699dde267b555fdfa05d8049dcf3 Dec 03 08:05:08 crc kubenswrapper[4831]: I1203 08:05:08.311083 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-tqqqc"] Dec 03 08:05:08 crc kubenswrapper[4831]: W1203 08:05:08.320992 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a657f01_40c8_4311_9929_1c8f616fdbd2.slice/crio-3bd396c82397d6797d9dafbc6358305c4f8daad8953f42435c849a3fa247228e WatchSource:0}: Error finding container 3bd396c82397d6797d9dafbc6358305c4f8daad8953f42435c849a3fa247228e: Status 404 returned error can't find the container with id 3bd396c82397d6797d9dafbc6358305c4f8daad8953f42435c849a3fa247228e Dec 03 08:05:08 crc kubenswrapper[4831]: I1203 08:05:08.434368 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-602f-account-create-update-zq45g"] Dec 03 08:05:08 crc kubenswrapper[4831]: I1203 08:05:08.536985 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-1d5b-account-create-update-8pz46"] Dec 03 08:05:08 crc kubenswrapper[4831]: W1203 08:05:08.614648 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod408a6d5e_0406_4be5_8618_55c294287e17.slice/crio-940821970d1f7a1f080f7b0b03cb4841147ddd36f599074f09df4d822f4bdf7e WatchSource:0}: Error finding container 940821970d1f7a1f080f7b0b03cb4841147ddd36f599074f09df4d822f4bdf7e: Status 404 returned error can't find the container with id 940821970d1f7a1f080f7b0b03cb4841147ddd36f599074f09df4d822f4bdf7e Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.019457 4831 generic.go:334] "Generic (PLEG): container finished" podID="826d082d-eb2b-448b-bcdb-5b74e20e492a" containerID="83afc0e752a5df8dcb286c53d8a6dcd5ae6353db2e2fafb318b6362af7991333" exitCode=0 Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.025298 4831 generic.go:334] "Generic (PLEG): container finished" podID="0c80b66c-c8e7-4252-9a57-dbd41a97b743" containerID="3aed52b3ebebf32f3f8db0d19b0d4bfdfc0a3f87d91edf8e0d473bfa37e94e9a" exitCode=0 Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.037141 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-cell0-602f-account-create-update-zq45g" podStartSLOduration=2.037117466 podStartE2EDuration="2.037117466s" podCreationTimestamp="2025-12-03 08:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:09.031953225 +0000 UTC m=+5646.375536753" watchObservedRunningTime="2025-12-03 08:05:09.037117466 +0000 UTC m=+5646.380700974" Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.038931 4831 generic.go:334] "Generic (PLEG): container finished" podID="6bc239b6-ab23-4d2e-a970-ca26af1e40b2" containerID="f1b858e32684363fea0633275d5005c29951501d24ea444d785f268121f09709" exitCode=0 Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.041162 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-602f-account-create-update-zq45g" event={"ID":"98f2f011-b635-4508-bc7a-9898bffb6ca4","Type":"ContainerStarted","Data":"82e2feb07f726b72e240005680cfdd8168a870bd48d8cc354c83df66a6570e1b"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.041203 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-602f-account-create-update-zq45g" event={"ID":"98f2f011-b635-4508-bc7a-9898bffb6ca4","Type":"ContainerStarted","Data":"ed72814aed3901af68c437821914898cc8d7dd0830356113b973ea5501143b62"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.041234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mzhpz" event={"ID":"826d082d-eb2b-448b-bcdb-5b74e20e492a","Type":"ContainerDied","Data":"83afc0e752a5df8dcb286c53d8a6dcd5ae6353db2e2fafb318b6362af7991333"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.041248 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mzhpz" event={"ID":"826d082d-eb2b-448b-bcdb-5b74e20e492a","Type":"ContainerStarted","Data":"5c13a4c70928316c31f6e43cef8df07f0e631367b6971671d882aa0129ea47c8"} Dec 03 08:05:09 
crc kubenswrapper[4831]: I1203 08:05:09.041257 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-799b-account-create-update-bs5mz" event={"ID":"0c80b66c-c8e7-4252-9a57-dbd41a97b743","Type":"ContainerDied","Data":"3aed52b3ebebf32f3f8db0d19b0d4bfdfc0a3f87d91edf8e0d473bfa37e94e9a"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.041269 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-799b-account-create-update-bs5mz" event={"ID":"0c80b66c-c8e7-4252-9a57-dbd41a97b743","Type":"ContainerStarted","Data":"7a896392c5b85e90241e8d97c501c6e4b1db699dde267b555fdfa05d8049dcf3"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.041278 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7w2vb" event={"ID":"6bc239b6-ab23-4d2e-a970-ca26af1e40b2","Type":"ContainerDied","Data":"f1b858e32684363fea0633275d5005c29951501d24ea444d785f268121f09709"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.041287 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7w2vb" event={"ID":"6bc239b6-ab23-4d2e-a970-ca26af1e40b2","Type":"ContainerStarted","Data":"e46992b69257c84199ceacf0b2cf67e13dc0411e9a0aa98a8b4033e3b4357555"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.052695 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tqqqc" event={"ID":"7a657f01-40c8-4311-9929-1c8f616fdbd2","Type":"ContainerStarted","Data":"ac915b22360dd455aaeec441b056b79e0b914d84f303fbb9ca2670a8acae304a"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.052911 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tqqqc" event={"ID":"7a657f01-40c8-4311-9929-1c8f616fdbd2","Type":"ContainerStarted","Data":"3bd396c82397d6797d9dafbc6358305c4f8daad8953f42435c849a3fa247228e"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.056644 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" event={"ID":"408a6d5e-0406-4be5-8618-55c294287e17","Type":"ContainerStarted","Data":"f59fdcd959907bf12a3096cfae5d1674e40b384b1ee88597ba2592f88e4267e2"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.056707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" event={"ID":"408a6d5e-0406-4be5-8618-55c294287e17","Type":"ContainerStarted","Data":"940821970d1f7a1f080f7b0b03cb4841147ddd36f599074f09df4d822f4bdf7e"} Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.087370 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-tqqqc" podStartSLOduration=2.087353829 podStartE2EDuration="2.087353829s" podCreationTimestamp="2025-12-03 08:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:09.081636711 +0000 UTC m=+5646.425220229" watchObservedRunningTime="2025-12-03 08:05:09.087353829 +0000 UTC m=+5646.430937347" Dec 03 08:05:09 crc kubenswrapper[4831]: I1203 08:05:09.137881 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" podStartSLOduration=2.137862531 podStartE2EDuration="2.137862531s" podCreationTimestamp="2025-12-03 08:05:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:09.117629472 +0000 UTC m=+5646.461212970" watchObservedRunningTime="2025-12-03 08:05:09.137862531 +0000 UTC m=+5646.481446039" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.070827 4831 generic.go:334] "Generic (PLEG): container finished" podID="98f2f011-b635-4508-bc7a-9898bffb6ca4" containerID="82e2feb07f726b72e240005680cfdd8168a870bd48d8cc354c83df66a6570e1b" exitCode=0 Dec 03 08:05:10 crc kubenswrapper[4831]: 
I1203 08:05:10.070950 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-602f-account-create-update-zq45g" event={"ID":"98f2f011-b635-4508-bc7a-9898bffb6ca4","Type":"ContainerDied","Data":"82e2feb07f726b72e240005680cfdd8168a870bd48d8cc354c83df66a6570e1b"} Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.074018 4831 generic.go:334] "Generic (PLEG): container finished" podID="7a657f01-40c8-4311-9929-1c8f616fdbd2" containerID="ac915b22360dd455aaeec441b056b79e0b914d84f303fbb9ca2670a8acae304a" exitCode=0 Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.074131 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tqqqc" event={"ID":"7a657f01-40c8-4311-9929-1c8f616fdbd2","Type":"ContainerDied","Data":"ac915b22360dd455aaeec441b056b79e0b914d84f303fbb9ca2670a8acae304a"} Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.076896 4831 generic.go:334] "Generic (PLEG): container finished" podID="408a6d5e-0406-4be5-8618-55c294287e17" containerID="f59fdcd959907bf12a3096cfae5d1674e40b384b1ee88597ba2592f88e4267e2" exitCode=0 Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.076965 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" event={"ID":"408a6d5e-0406-4be5-8618-55c294287e17","Type":"ContainerDied","Data":"f59fdcd959907bf12a3096cfae5d1674e40b384b1ee88597ba2592f88e4267e2"} Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.569431 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.578791 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.598584 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.642355 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-operator-scripts\") pod \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\" (UID: \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\") " Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.642485 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b66c-c8e7-4252-9a57-dbd41a97b743-operator-scripts\") pod \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\" (UID: \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\") " Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.642652 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltwbw\" (UniqueName: \"kubernetes.io/projected/0c80b66c-c8e7-4252-9a57-dbd41a97b743-kube-api-access-ltwbw\") pod \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\" (UID: \"0c80b66c-c8e7-4252-9a57-dbd41a97b743\") " Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.643094 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bc239b6-ab23-4d2e-a970-ca26af1e40b2" (UID: "6bc239b6-ab23-4d2e-a970-ca26af1e40b2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.643109 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c80b66c-c8e7-4252-9a57-dbd41a97b743-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c80b66c-c8e7-4252-9a57-dbd41a97b743" (UID: "0c80b66c-c8e7-4252-9a57-dbd41a97b743"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.643584 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjlb9\" (UniqueName: \"kubernetes.io/projected/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-kube-api-access-kjlb9\") pod \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\" (UID: \"6bc239b6-ab23-4d2e-a970-ca26af1e40b2\") " Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.644014 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.644031 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c80b66c-c8e7-4252-9a57-dbd41a97b743-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.648499 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-kube-api-access-kjlb9" (OuterVolumeSpecName: "kube-api-access-kjlb9") pod "6bc239b6-ab23-4d2e-a970-ca26af1e40b2" (UID: "6bc239b6-ab23-4d2e-a970-ca26af1e40b2"). InnerVolumeSpecName "kube-api-access-kjlb9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.648543 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c80b66c-c8e7-4252-9a57-dbd41a97b743-kube-api-access-ltwbw" (OuterVolumeSpecName: "kube-api-access-ltwbw") pod "0c80b66c-c8e7-4252-9a57-dbd41a97b743" (UID: "0c80b66c-c8e7-4252-9a57-dbd41a97b743"). InnerVolumeSpecName "kube-api-access-ltwbw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.745527 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826d082d-eb2b-448b-bcdb-5b74e20e492a-operator-scripts\") pod \"826d082d-eb2b-448b-bcdb-5b74e20e492a\" (UID: \"826d082d-eb2b-448b-bcdb-5b74e20e492a\") " Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.745807 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ssbx2\" (UniqueName: \"kubernetes.io/projected/826d082d-eb2b-448b-bcdb-5b74e20e492a-kube-api-access-ssbx2\") pod \"826d082d-eb2b-448b-bcdb-5b74e20e492a\" (UID: \"826d082d-eb2b-448b-bcdb-5b74e20e492a\") " Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.746155 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltwbw\" (UniqueName: \"kubernetes.io/projected/0c80b66c-c8e7-4252-9a57-dbd41a97b743-kube-api-access-ltwbw\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.746172 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjlb9\" (UniqueName: \"kubernetes.io/projected/6bc239b6-ab23-4d2e-a970-ca26af1e40b2-kube-api-access-kjlb9\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.746819 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/826d082d-eb2b-448b-bcdb-5b74e20e492a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "826d082d-eb2b-448b-bcdb-5b74e20e492a" (UID: "826d082d-eb2b-448b-bcdb-5b74e20e492a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.750243 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/826d082d-eb2b-448b-bcdb-5b74e20e492a-kube-api-access-ssbx2" (OuterVolumeSpecName: "kube-api-access-ssbx2") pod "826d082d-eb2b-448b-bcdb-5b74e20e492a" (UID: "826d082d-eb2b-448b-bcdb-5b74e20e492a"). InnerVolumeSpecName "kube-api-access-ssbx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.848410 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/826d082d-eb2b-448b-bcdb-5b74e20e492a-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:10 crc kubenswrapper[4831]: I1203 08:05:10.848450 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ssbx2\" (UniqueName: \"kubernetes.io/projected/826d082d-eb2b-448b-bcdb-5b74e20e492a-kube-api-access-ssbx2\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.091642 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-mzhpz" event={"ID":"826d082d-eb2b-448b-bcdb-5b74e20e492a","Type":"ContainerDied","Data":"5c13a4c70928316c31f6e43cef8df07f0e631367b6971671d882aa0129ea47c8"} Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.091717 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c13a4c70928316c31f6e43cef8df07f0e631367b6971671d882aa0129ea47c8" Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.091741 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-mzhpz" Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.095312 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-799b-account-create-update-bs5mz" Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.095371 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-799b-account-create-update-bs5mz" event={"ID":"0c80b66c-c8e7-4252-9a57-dbd41a97b743","Type":"ContainerDied","Data":"7a896392c5b85e90241e8d97c501c6e4b1db699dde267b555fdfa05d8049dcf3"} Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.095479 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7a896392c5b85e90241e8d97c501c6e4b1db699dde267b555fdfa05d8049dcf3" Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.098240 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7w2vb" event={"ID":"6bc239b6-ab23-4d2e-a970-ca26af1e40b2","Type":"ContainerDied","Data":"e46992b69257c84199ceacf0b2cf67e13dc0411e9a0aa98a8b4033e3b4357555"} Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.098308 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e46992b69257c84199ceacf0b2cf67e13dc0411e9a0aa98a8b4033e3b4357555" Dec 03 08:05:11 crc kubenswrapper[4831]: I1203 08:05:11.098429 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7w2vb" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.636747 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.642844 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.652429 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.797707 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/408a6d5e-0406-4be5-8618-55c294287e17-operator-scripts\") pod \"408a6d5e-0406-4be5-8618-55c294287e17\" (UID: \"408a6d5e-0406-4be5-8618-55c294287e17\") " Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.797843 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a657f01-40c8-4311-9929-1c8f616fdbd2-operator-scripts\") pod \"7a657f01-40c8-4311-9929-1c8f616fdbd2\" (UID: \"7a657f01-40c8-4311-9929-1c8f616fdbd2\") " Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.797895 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5q2zl\" (UniqueName: \"kubernetes.io/projected/408a6d5e-0406-4be5-8618-55c294287e17-kube-api-access-5q2zl\") pod \"408a6d5e-0406-4be5-8618-55c294287e17\" (UID: \"408a6d5e-0406-4be5-8618-55c294287e17\") " Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.797993 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwf4h\" (UniqueName: \"kubernetes.io/projected/98f2f011-b635-4508-bc7a-9898bffb6ca4-kube-api-access-qwf4h\") pod \"98f2f011-b635-4508-bc7a-9898bffb6ca4\" (UID: \"98f2f011-b635-4508-bc7a-9898bffb6ca4\") " Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.798010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98f2f011-b635-4508-bc7a-9898bffb6ca4-operator-scripts\") pod \"98f2f011-b635-4508-bc7a-9898bffb6ca4\" (UID: \"98f2f011-b635-4508-bc7a-9898bffb6ca4\") " Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.798048 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-cptkk\" (UniqueName: \"kubernetes.io/projected/7a657f01-40c8-4311-9929-1c8f616fdbd2-kube-api-access-cptkk\") pod \"7a657f01-40c8-4311-9929-1c8f616fdbd2\" (UID: \"7a657f01-40c8-4311-9929-1c8f616fdbd2\") " Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.798402 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7a657f01-40c8-4311-9929-1c8f616fdbd2-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7a657f01-40c8-4311-9929-1c8f616fdbd2" (UID: "7a657f01-40c8-4311-9929-1c8f616fdbd2"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.798403 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/408a6d5e-0406-4be5-8618-55c294287e17-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "408a6d5e-0406-4be5-8618-55c294287e17" (UID: "408a6d5e-0406-4be5-8618-55c294287e17"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.799163 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98f2f011-b635-4508-bc7a-9898bffb6ca4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "98f2f011-b635-4508-bc7a-9898bffb6ca4" (UID: "98f2f011-b635-4508-bc7a-9898bffb6ca4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.802074 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98f2f011-b635-4508-bc7a-9898bffb6ca4-kube-api-access-qwf4h" (OuterVolumeSpecName: "kube-api-access-qwf4h") pod "98f2f011-b635-4508-bc7a-9898bffb6ca4" (UID: "98f2f011-b635-4508-bc7a-9898bffb6ca4"). 
InnerVolumeSpecName "kube-api-access-qwf4h". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.802449 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/408a6d5e-0406-4be5-8618-55c294287e17-kube-api-access-5q2zl" (OuterVolumeSpecName: "kube-api-access-5q2zl") pod "408a6d5e-0406-4be5-8618-55c294287e17" (UID: "408a6d5e-0406-4be5-8618-55c294287e17"). InnerVolumeSpecName "kube-api-access-5q2zl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.803027 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a657f01-40c8-4311-9929-1c8f616fdbd2-kube-api-access-cptkk" (OuterVolumeSpecName: "kube-api-access-cptkk") pod "7a657f01-40c8-4311-9929-1c8f616fdbd2" (UID: "7a657f01-40c8-4311-9929-1c8f616fdbd2"). InnerVolumeSpecName "kube-api-access-cptkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.899964 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cptkk\" (UniqueName: \"kubernetes.io/projected/7a657f01-40c8-4311-9929-1c8f616fdbd2-kube-api-access-cptkk\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.900012 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/408a6d5e-0406-4be5-8618-55c294287e17-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.900026 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7a657f01-40c8-4311-9929-1c8f616fdbd2-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.900041 4831 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-5q2zl\" (UniqueName: \"kubernetes.io/projected/408a6d5e-0406-4be5-8618-55c294287e17-kube-api-access-5q2zl\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.900054 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwf4h\" (UniqueName: \"kubernetes.io/projected/98f2f011-b635-4508-bc7a-9898bffb6ca4-kube-api-access-qwf4h\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:11.900066 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/98f2f011-b635-4508-bc7a-9898bffb6ca4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.118685 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-tqqqc" event={"ID":"7a657f01-40c8-4311-9929-1c8f616fdbd2","Type":"ContainerDied","Data":"3bd396c82397d6797d9dafbc6358305c4f8daad8953f42435c849a3fa247228e"} Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.118725 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3bd396c82397d6797d9dafbc6358305c4f8daad8953f42435c849a3fa247228e" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.118857 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-tqqqc" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.130227 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.130253 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-1d5b-account-create-update-8pz46" event={"ID":"408a6d5e-0406-4be5-8618-55c294287e17","Type":"ContainerDied","Data":"940821970d1f7a1f080f7b0b03cb4841147ddd36f599074f09df4d822f4bdf7e"} Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.130310 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="940821970d1f7a1f080f7b0b03cb4841147ddd36f599074f09df4d822f4bdf7e" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.133736 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-602f-account-create-update-zq45g" event={"ID":"98f2f011-b635-4508-bc7a-9898bffb6ca4","Type":"ContainerDied","Data":"ed72814aed3901af68c437821914898cc8d7dd0830356113b973ea5501143b62"} Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.133786 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed72814aed3901af68c437821914898cc8d7dd0830356113b973ea5501143b62" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.133853 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-602f-account-create-update-zq45g" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.897685 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh2bx"] Dec 03 08:05:12 crc kubenswrapper[4831]: E1203 08:05:12.898219 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98f2f011-b635-4508-bc7a-9898bffb6ca4" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898253 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="98f2f011-b635-4508-bc7a-9898bffb6ca4" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: E1203 08:05:12.898265 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="408a6d5e-0406-4be5-8618-55c294287e17" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898273 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="408a6d5e-0406-4be5-8618-55c294287e17" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: E1203 08:05:12.898286 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a657f01-40c8-4311-9929-1c8f616fdbd2" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898294 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a657f01-40c8-4311-9929-1c8f616fdbd2" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: E1203 08:05:12.898346 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="826d082d-eb2b-448b-bcdb-5b74e20e492a" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898352 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="826d082d-eb2b-448b-bcdb-5b74e20e492a" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: E1203 08:05:12.898371 4831 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c80b66c-c8e7-4252-9a57-dbd41a97b743" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898376 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c80b66c-c8e7-4252-9a57-dbd41a97b743" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: E1203 08:05:12.898385 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bc239b6-ab23-4d2e-a970-ca26af1e40b2" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898391 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bc239b6-ab23-4d2e-a970-ca26af1e40b2" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898584 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="98f2f011-b635-4508-bc7a-9898bffb6ca4" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898601 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bc239b6-ab23-4d2e-a970-ca26af1e40b2" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898611 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="408a6d5e-0406-4be5-8618-55c294287e17" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898620 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a657f01-40c8-4311-9929-1c8f616fdbd2" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898645 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="826d082d-eb2b-448b-bcdb-5b74e20e492a" containerName="mariadb-database-create" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.898655 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="0c80b66c-c8e7-4252-9a57-dbd41a97b743" containerName="mariadb-account-create-update" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.899346 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.901948 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.902207 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7jthb" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.902283 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Dec 03 08:05:12 crc kubenswrapper[4831]: I1203 08:05:12.922719 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh2bx"] Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.036767 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-scripts\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.036971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-config-data\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.037092 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jncbj\" (UniqueName: 
\"kubernetes.io/projected/6f966f8d-94ec-4e11-9049-35b3b66e192b-kube-api-access-jncbj\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.037146 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.138792 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jncbj\" (UniqueName: \"kubernetes.io/projected/6f966f8d-94ec-4e11-9049-35b3b66e192b-kube-api-access-jncbj\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.138854 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.139021 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-scripts\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.139099 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-config-data\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.147949 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-scripts\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.147956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.148098 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-config-data\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.167081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jncbj\" (UniqueName: \"kubernetes.io/projected/6f966f8d-94ec-4e11-9049-35b3b66e192b-kube-api-access-jncbj\") pod \"nova-cell0-conductor-db-sync-zh2bx\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.214242 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:13 crc kubenswrapper[4831]: I1203 08:05:13.654810 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh2bx"] Dec 03 08:05:13 crc kubenswrapper[4831]: W1203 08:05:13.664056 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6f966f8d_94ec_4e11_9049_35b3b66e192b.slice/crio-c2c6606c838b7d627ad2ab3f9c6d2587abf054df31627e49163ce223faed8738 WatchSource:0}: Error finding container c2c6606c838b7d627ad2ab3f9c6d2587abf054df31627e49163ce223faed8738: Status 404 returned error can't find the container with id c2c6606c838b7d627ad2ab3f9c6d2587abf054df31627e49163ce223faed8738 Dec 03 08:05:14 crc kubenswrapper[4831]: I1203 08:05:14.152555 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" event={"ID":"6f966f8d-94ec-4e11-9049-35b3b66e192b","Type":"ContainerStarted","Data":"0e43dcc618667594635298af43a769444279529d0a4f5f00117293a6ad37d6fb"} Dec 03 08:05:14 crc kubenswrapper[4831]: I1203 08:05:14.152926 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" event={"ID":"6f966f8d-94ec-4e11-9049-35b3b66e192b","Type":"ContainerStarted","Data":"c2c6606c838b7d627ad2ab3f9c6d2587abf054df31627e49163ce223faed8738"} Dec 03 08:05:14 crc kubenswrapper[4831]: I1203 08:05:14.190555 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" podStartSLOduration=2.19052391 podStartE2EDuration="2.19052391s" podCreationTimestamp="2025-12-03 08:05:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:14.175401479 +0000 UTC m=+5651.518985007" watchObservedRunningTime="2025-12-03 08:05:14.19052391 +0000 UTC 
m=+5651.534107458" Dec 03 08:05:20 crc kubenswrapper[4831]: I1203 08:05:20.217541 4831 generic.go:334] "Generic (PLEG): container finished" podID="6f966f8d-94ec-4e11-9049-35b3b66e192b" containerID="0e43dcc618667594635298af43a769444279529d0a4f5f00117293a6ad37d6fb" exitCode=0 Dec 03 08:05:20 crc kubenswrapper[4831]: I1203 08:05:20.217656 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" event={"ID":"6f966f8d-94ec-4e11-9049-35b3b66e192b","Type":"ContainerDied","Data":"0e43dcc618667594635298af43a769444279529d0a4f5f00117293a6ad37d6fb"} Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.573222 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.719443 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-combined-ca-bundle\") pod \"6f966f8d-94ec-4e11-9049-35b3b66e192b\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.719660 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jncbj\" (UniqueName: \"kubernetes.io/projected/6f966f8d-94ec-4e11-9049-35b3b66e192b-kube-api-access-jncbj\") pod \"6f966f8d-94ec-4e11-9049-35b3b66e192b\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.719830 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-scripts\") pod \"6f966f8d-94ec-4e11-9049-35b3b66e192b\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.719872 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-config-data\") pod \"6f966f8d-94ec-4e11-9049-35b3b66e192b\" (UID: \"6f966f8d-94ec-4e11-9049-35b3b66e192b\") " Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.727956 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f966f8d-94ec-4e11-9049-35b3b66e192b-kube-api-access-jncbj" (OuterVolumeSpecName: "kube-api-access-jncbj") pod "6f966f8d-94ec-4e11-9049-35b3b66e192b" (UID: "6f966f8d-94ec-4e11-9049-35b3b66e192b"). InnerVolumeSpecName "kube-api-access-jncbj". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.728055 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-scripts" (OuterVolumeSpecName: "scripts") pod "6f966f8d-94ec-4e11-9049-35b3b66e192b" (UID: "6f966f8d-94ec-4e11-9049-35b3b66e192b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.770785 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-config-data" (OuterVolumeSpecName: "config-data") pod "6f966f8d-94ec-4e11-9049-35b3b66e192b" (UID: "6f966f8d-94ec-4e11-9049-35b3b66e192b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.775437 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6f966f8d-94ec-4e11-9049-35b3b66e192b" (UID: "6f966f8d-94ec-4e11-9049-35b3b66e192b"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.822638 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.822680 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.822693 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6f966f8d-94ec-4e11-9049-35b3b66e192b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:21 crc kubenswrapper[4831]: I1203 08:05:21.822707 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jncbj\" (UniqueName: \"kubernetes.io/projected/6f966f8d-94ec-4e11-9049-35b3b66e192b-kube-api-access-jncbj\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.245363 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" event={"ID":"6f966f8d-94ec-4e11-9049-35b3b66e192b","Type":"ContainerDied","Data":"c2c6606c838b7d627ad2ab3f9c6d2587abf054df31627e49163ce223faed8738"} Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.245452 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2c6606c838b7d627ad2ab3f9c6d2587abf054df31627e49163ce223faed8738" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.245398 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-zh2bx" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.351691 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:05:22 crc kubenswrapper[4831]: E1203 08:05:22.352204 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f966f8d-94ec-4e11-9049-35b3b66e192b" containerName="nova-cell0-conductor-db-sync" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.352232 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f966f8d-94ec-4e11-9049-35b3b66e192b" containerName="nova-cell0-conductor-db-sync" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.352570 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f966f8d-94ec-4e11-9049-35b3b66e192b" containerName="nova-cell0-conductor-db-sync" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.354170 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.357101 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.357451 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-7jthb" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.370775 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.536588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 
08:05:22.536960 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.536988 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjfxn\" (UniqueName: \"kubernetes.io/projected/1b192790-67c2-4112-bd22-9b0abbfb9394-kube-api-access-jjfxn\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.638690 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.638802 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjfxn\" (UniqueName: \"kubernetes.io/projected/1b192790-67c2-4112-bd22-9b0abbfb9394-kube-api-access-jjfxn\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.638930 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.643446 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.644904 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.658233 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjfxn\" (UniqueName: \"kubernetes.io/projected/1b192790-67c2-4112-bd22-9b0abbfb9394-kube-api-access-jjfxn\") pod \"nova-cell0-conductor-0\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:22 crc kubenswrapper[4831]: I1203 08:05:22.691129 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:23 crc kubenswrapper[4831]: I1203 08:05:23.005804 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:05:23 crc kubenswrapper[4831]: I1203 08:05:23.255082 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1b192790-67c2-4112-bd22-9b0abbfb9394","Type":"ContainerStarted","Data":"7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396"} Dec 03 08:05:23 crc kubenswrapper[4831]: I1203 08:05:23.255657 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:23 crc kubenswrapper[4831]: I1203 08:05:23.255700 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1b192790-67c2-4112-bd22-9b0abbfb9394","Type":"ContainerStarted","Data":"f7642a4e280ad328a6c8bc5839918eee3b3b63f365ceb76ca05adca364353fd8"} Dec 03 08:05:23 crc kubenswrapper[4831]: I1203 08:05:23.280044 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.280022805 podStartE2EDuration="1.280022805s" podCreationTimestamp="2025-12-03 08:05:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:23.273952996 +0000 UTC m=+5660.617536514" watchObservedRunningTime="2025-12-03 08:05:23.280022805 +0000 UTC m=+5660.623606323" Dec 03 08:05:27 crc kubenswrapper[4831]: I1203 08:05:27.596279 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:05:27 crc kubenswrapper[4831]: I1203 08:05:27.596736 4831 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:05:32 crc kubenswrapper[4831]: I1203 08:05:32.740182 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.235838 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-gzxbp"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.239571 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.242063 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.242603 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.250872 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gzxbp"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.331714 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-scripts\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.331943 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-config-data\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.332047 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bcwmk\" (UniqueName: \"kubernetes.io/projected/ddf02d76-b548-4fab-9e1d-690a64c0be2e-kube-api-access-bcwmk\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.332144 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.434303 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.434411 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-scripts\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.434523 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-config-data\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 
08:05:33.434579 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bcwmk\" (UniqueName: \"kubernetes.io/projected/ddf02d76-b548-4fab-9e1d-690a64c0be2e-kube-api-access-bcwmk\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.434621 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.435563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.442885 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-scripts\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.443557 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.448619 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.478091 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bcwmk\" (UniqueName: \"kubernetes.io/projected/ddf02d76-b548-4fab-9e1d-690a64c0be2e-kube-api-access-bcwmk\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.478110 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-config-data\") pod \"nova-cell0-cell-mapping-gzxbp\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.490215 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.491484 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.499636 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.510341 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.517357 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.536112 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtkxz\" (UniqueName: \"kubernetes.io/projected/ba2d1ea0-7885-43ba-86b0-bee016342826-kube-api-access-jtkxz\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.536165 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-config-data\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.536224 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.549652 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.552350 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.557392 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.585972 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.625441 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.627385 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.629165 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.634669 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.638214 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640307 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkspf\" (UniqueName: \"kubernetes.io/projected/527835db-0f53-4146-87b2-e382437f5014-kube-api-access-nkspf\") pod \"nova-cell1-novncproxy-0\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640350 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5960168a-c93f-4f82-b663-9e1bc8108758-logs\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640380 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640423 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtkxz\" (UniqueName: 
\"kubernetes.io/projected/ba2d1ea0-7885-43ba-86b0-bee016342826-kube-api-access-jtkxz\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640854 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-config-data\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640881 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640901 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f89z4\" (UniqueName: \"kubernetes.io/projected/5960168a-c93f-4f82-b663-9e1bc8108758-kube-api-access-f89z4\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640940 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.640960 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-config-data\") pod \"nova-api-0\" (UID: 
\"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.661041 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-config-data\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.696119 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.706924 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtkxz\" (UniqueName: \"kubernetes.io/projected/ba2d1ea0-7885-43ba-86b0-bee016342826-kube-api-access-jtkxz\") pod \"nova-scheduler-0\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.713415 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746204 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-logs\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746260 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkspf\" (UniqueName: \"kubernetes.io/projected/527835db-0f53-4146-87b2-e382437f5014-kube-api-access-nkspf\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746285 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5960168a-c93f-4f82-b663-9e1bc8108758-logs\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746313 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746350 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746377 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-config-data\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746399 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl2l8\" (UniqueName: \"kubernetes.io/projected/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-kube-api-access-fl2l8\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: 
I1203 08:05:33.746423 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746440 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f89z4\" (UniqueName: \"kubernetes.io/projected/5960168a-c93f-4f82-b663-9e1bc8108758-kube-api-access-f89z4\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746475 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.746491 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-config-data\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.748184 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5960168a-c93f-4f82-b663-9e1bc8108758-logs\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.765058 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-config-data\") pod \"nova-api-0\" (UID: 
\"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.767400 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.782914 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkspf\" (UniqueName: \"kubernetes.io/projected/527835db-0f53-4146-87b2-e382437f5014-kube-api-access-nkspf\") pod \"nova-cell1-novncproxy-0\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.783986 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f89z4\" (UniqueName: \"kubernetes.io/projected/5960168a-c93f-4f82-b663-9e1bc8108758-kube-api-access-f89z4\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.795029 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") " pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.797855 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.825828 4831 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/dnsmasq-dns-77595d75f7-z7pp2"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.827549 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.846768 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.847973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-logs\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.848029 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.848056 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-config-data\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.848079 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl2l8\" (UniqueName: \"kubernetes.io/projected/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-kube-api-access-fl2l8\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.857818 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-logs\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.857983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.873086 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl2l8\" (UniqueName: \"kubernetes.io/projected/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-kube-api-access-fl2l8\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.876185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-config-data\") pod \"nova-metadata-0\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") " pod="openstack/nova-metadata-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.881746 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77595d75f7-z7pp2"] Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.885339 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.908784 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.951889 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfzg4\" (UniqueName: \"kubernetes.io/projected/a8de247c-1cb7-448b-9071-0742bea10b51-kube-api-access-gfzg4\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.952260 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-sb\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.952289 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-config\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.952344 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-nb\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:33 crc kubenswrapper[4831]: I1203 08:05:33.952372 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-dns-svc\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: 
\"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.054119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-nb\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.054170 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-dns-svc\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.054269 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfzg4\" (UniqueName: \"kubernetes.io/projected/a8de247c-1cb7-448b-9071-0742bea10b51-kube-api-access-gfzg4\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.054302 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-sb\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.054342 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-config\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " 
pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.055274 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-dns-svc\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.055530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-config\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.055549 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-sb\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.056064 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-nb\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.077991 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfzg4\" (UniqueName: \"kubernetes.io/projected/a8de247c-1cb7-448b-9071-0742bea10b51-kube-api-access-gfzg4\") pod \"dnsmasq-dns-77595d75f7-z7pp2\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.159054 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.190993 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.319403 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-gzxbp"] Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.330327 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:34 crc kubenswrapper[4831]: W1203 08:05:34.338196 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podddf02d76_b548_4fab_9e1d_690a64c0be2e.slice/crio-ae7a94cfe963ca02c3ae459121cd5a969dad8247e9e32bde67c9d889f70bf87c WatchSource:0}: Error finding container ae7a94cfe963ca02c3ae459121cd5a969dad8247e9e32bde67c9d889f70bf87c: Status 404 returned error can't find the container with id ae7a94cfe963ca02c3ae459121cd5a969dad8247e9e32bde67c9d889f70bf87c Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.412181 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hjj9z"] Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.413405 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.415402 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.418198 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.422033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gzxbp" event={"ID":"ddf02d76-b548-4fab-9e1d-690a64c0be2e","Type":"ContainerStarted","Data":"ae7a94cfe963ca02c3ae459121cd5a969dad8247e9e32bde67c9d889f70bf87c"} Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.424515 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba2d1ea0-7885-43ba-86b0-bee016342826","Type":"ContainerStarted","Data":"64b658b062375f00f4f95afef60b8b7dc289e249e78021596570eed2ec953f86"} Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.427015 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hjj9z"] Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.459340 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.459393 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:05:34 crc kubenswrapper[4831]: W1203 08:05:34.467836 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod527835db_0f53_4146_87b2_e382437f5014.slice/crio-67b051f83eb139b5ee2cc97e66249af597b23c0a9fadba4252fb9afe9f188264 WatchSource:0}: Error finding container 67b051f83eb139b5ee2cc97e66249af597b23c0a9fadba4252fb9afe9f188264: Status 404 returned error can't find the container with id 
67b051f83eb139b5ee2cc97e66249af597b23c0a9fadba4252fb9afe9f188264 Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.567858 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfslq\" (UniqueName: \"kubernetes.io/projected/4689e9b9-ff30-42db-b689-949dd272945f-kube-api-access-lfslq\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.567971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.568069 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-config-data\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.568117 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-scripts\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.649799 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:05:34 crc kubenswrapper[4831]: W1203 08:05:34.661054 4831 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7800ac0c_ba24_4fa3_b34c_f6e48bff7efd.slice/crio-18f10174971f3f49ff70b141119822bd7d465336f8d8bcbd1a2e47e1148557ff WatchSource:0}: Error finding container 18f10174971f3f49ff70b141119822bd7d465336f8d8bcbd1a2e47e1148557ff: Status 404 returned error can't find the container with id 18f10174971f3f49ff70b141119822bd7d465336f8d8bcbd1a2e47e1148557ff Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.669816 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.669899 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-config-data\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.669925 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-scripts\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.669969 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfslq\" (UniqueName: \"kubernetes.io/projected/4689e9b9-ff30-42db-b689-949dd272945f-kube-api-access-lfslq\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " 
pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.673657 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-scripts\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.675059 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.687069 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-config-data\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.687978 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfslq\" (UniqueName: \"kubernetes.io/projected/4689e9b9-ff30-42db-b689-949dd272945f-kube-api-access-lfslq\") pod \"nova-cell1-conductor-db-sync-hjj9z\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.744498 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:34 crc kubenswrapper[4831]: W1203 08:05:34.757104 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8de247c_1cb7_448b_9071_0742bea10b51.slice/crio-b0ddbea2f7c49c2a1ce4b9b17164b6df0697e86903c49dae5ccb2a92771da182 WatchSource:0}: Error finding container b0ddbea2f7c49c2a1ce4b9b17164b6df0697e86903c49dae5ccb2a92771da182: Status 404 returned error can't find the container with id b0ddbea2f7c49c2a1ce4b9b17164b6df0697e86903c49dae5ccb2a92771da182 Dec 03 08:05:34 crc kubenswrapper[4831]: I1203 08:05:34.761364 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-77595d75f7-z7pp2"] Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.187142 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hjj9z"] Dec 03 08:05:35 crc kubenswrapper[4831]: W1203 08:05:35.192443 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4689e9b9_ff30_42db_b689_949dd272945f.slice/crio-57256129eea7dab8edc6d067578dbc9fbfebb9ab84c402a798f5f78764d5534b WatchSource:0}: Error finding container 57256129eea7dab8edc6d067578dbc9fbfebb9ab84c402a798f5f78764d5534b: Status 404 returned error can't find the container with id 57256129eea7dab8edc6d067578dbc9fbfebb9ab84c402a798f5f78764d5534b Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.450634 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"527835db-0f53-4146-87b2-e382437f5014","Type":"ContainerStarted","Data":"31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.451077 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"527835db-0f53-4146-87b2-e382437f5014","Type":"ContainerStarted","Data":"67b051f83eb139b5ee2cc97e66249af597b23c0a9fadba4252fb9afe9f188264"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.478658 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5960168a-c93f-4f82-b663-9e1bc8108758","Type":"ContainerStarted","Data":"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.478706 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5960168a-c93f-4f82-b663-9e1bc8108758","Type":"ContainerStarted","Data":"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.478716 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5960168a-c93f-4f82-b663-9e1bc8108758","Type":"ContainerStarted","Data":"f1a50b65295ca6bd622fffbcf1b4532bd29ee3ada63ddc84c49357187d807591"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.480922 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.480909166 podStartE2EDuration="2.480909166s" podCreationTimestamp="2025-12-03 08:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:35.476078805 +0000 UTC m=+5672.819662313" watchObservedRunningTime="2025-12-03 08:05:35.480909166 +0000 UTC m=+5672.824492674" Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.496552 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" event={"ID":"4689e9b9-ff30-42db-b689-949dd272945f","Type":"ContainerStarted","Data":"49742facba31fb15c7ee2566e3fbcab42abd2e795fff55e7a1dcf02453e23451"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.496596 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" event={"ID":"4689e9b9-ff30-42db-b689-949dd272945f","Type":"ContainerStarted","Data":"57256129eea7dab8edc6d067578dbc9fbfebb9ab84c402a798f5f78764d5534b"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.506885 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.506867313 podStartE2EDuration="2.506867313s" podCreationTimestamp="2025-12-03 08:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:35.505487211 +0000 UTC m=+5672.849070719" watchObservedRunningTime="2025-12-03 08:05:35.506867313 +0000 UTC m=+5672.850450821" Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.507890 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gzxbp" event={"ID":"ddf02d76-b548-4fab-9e1d-690a64c0be2e","Type":"ContainerStarted","Data":"2b04056302db6bd7faa4fc4e590fcd62c93780a8569813067249092a58c255ad"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.516613 4831 generic.go:334] "Generic (PLEG): container finished" podID="a8de247c-1cb7-448b-9071-0742bea10b51" containerID="f06c2836bde642573a46ebd630af0e45a7fc19dbe484c13278edd6f3cf95bb99" exitCode=0 Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.516676 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" event={"ID":"a8de247c-1cb7-448b-9071-0742bea10b51","Type":"ContainerDied","Data":"f06c2836bde642573a46ebd630af0e45a7fc19dbe484c13278edd6f3cf95bb99"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.516702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" 
event={"ID":"a8de247c-1cb7-448b-9071-0742bea10b51","Type":"ContainerStarted","Data":"b0ddbea2f7c49c2a1ce4b9b17164b6df0697e86903c49dae5ccb2a92771da182"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.526554 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd","Type":"ContainerStarted","Data":"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.526600 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd","Type":"ContainerStarted","Data":"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.526609 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd","Type":"ContainerStarted","Data":"18f10174971f3f49ff70b141119822bd7d465336f8d8bcbd1a2e47e1148557ff"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.530261 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" podStartSLOduration=1.530244361 podStartE2EDuration="1.530244361s" podCreationTimestamp="2025-12-03 08:05:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:35.521059325 +0000 UTC m=+5672.864642833" watchObservedRunningTime="2025-12-03 08:05:35.530244361 +0000 UTC m=+5672.873827859" Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.532377 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba2d1ea0-7885-43ba-86b0-bee016342826","Type":"ContainerStarted","Data":"f2db86eafc47956a4017c9a91aefd8df080a693ea3bd8adca618989d3e6dc08d"} Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.603222 
4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-gzxbp" podStartSLOduration=2.603202272 podStartE2EDuration="2.603202272s" podCreationTimestamp="2025-12-03 08:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:35.561904637 +0000 UTC m=+5672.905488135" watchObservedRunningTime="2025-12-03 08:05:35.603202272 +0000 UTC m=+5672.946785780" Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.618657 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.618632732 podStartE2EDuration="2.618632732s" podCreationTimestamp="2025-12-03 08:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:35.579273428 +0000 UTC m=+5672.922856936" watchObservedRunningTime="2025-12-03 08:05:35.618632732 +0000 UTC m=+5672.962216240" Dec 03 08:05:35 crc kubenswrapper[4831]: I1203 08:05:35.638528 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.63850488 podStartE2EDuration="2.63850488s" podCreationTimestamp="2025-12-03 08:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:35.596528764 +0000 UTC m=+5672.940112262" watchObservedRunningTime="2025-12-03 08:05:35.63850488 +0000 UTC m=+5672.982088408" Dec 03 08:05:36 crc kubenswrapper[4831]: I1203 08:05:36.546881 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" event={"ID":"a8de247c-1cb7-448b-9071-0742bea10b51","Type":"ContainerStarted","Data":"c2aa2133cc51efb73d16b353a60c8108bd5985f29ff0bcd06041b48f162bbefc"} Dec 03 08:05:36 crc kubenswrapper[4831]: I1203 08:05:36.548351 
4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:36 crc kubenswrapper[4831]: I1203 08:05:36.582997 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" podStartSLOduration=3.582971152 podStartE2EDuration="3.582971152s" podCreationTimestamp="2025-12-03 08:05:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:36.57454229 +0000 UTC m=+5673.918125808" watchObservedRunningTime="2025-12-03 08:05:36.582971152 +0000 UTC m=+5673.926554690" Dec 03 08:05:38 crc kubenswrapper[4831]: I1203 08:05:38.575292 4831 generic.go:334] "Generic (PLEG): container finished" podID="4689e9b9-ff30-42db-b689-949dd272945f" containerID="49742facba31fb15c7ee2566e3fbcab42abd2e795fff55e7a1dcf02453e23451" exitCode=0 Dec 03 08:05:38 crc kubenswrapper[4831]: I1203 08:05:38.575810 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" event={"ID":"4689e9b9-ff30-42db-b689-949dd272945f","Type":"ContainerDied","Data":"49742facba31fb15c7ee2566e3fbcab42abd2e795fff55e7a1dcf02453e23451"} Dec 03 08:05:38 crc kubenswrapper[4831]: I1203 08:05:38.848019 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 08:05:38 crc kubenswrapper[4831]: I1203 08:05:38.886093 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:39 crc kubenswrapper[4831]: I1203 08:05:39.160550 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 08:05:39 crc kubenswrapper[4831]: I1203 08:05:39.161170 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 08:05:39 crc kubenswrapper[4831]: I1203 08:05:39.591893 4831 
generic.go:334] "Generic (PLEG): container finished" podID="ddf02d76-b548-4fab-9e1d-690a64c0be2e" containerID="2b04056302db6bd7faa4fc4e590fcd62c93780a8569813067249092a58c255ad" exitCode=0 Dec 03 08:05:39 crc kubenswrapper[4831]: I1203 08:05:39.592922 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gzxbp" event={"ID":"ddf02d76-b548-4fab-9e1d-690a64c0be2e","Type":"ContainerDied","Data":"2b04056302db6bd7faa4fc4e590fcd62c93780a8569813067249092a58c255ad"} Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.132645 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.182165 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-combined-ca-bundle\") pod \"4689e9b9-ff30-42db-b689-949dd272945f\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.182364 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfslq\" (UniqueName: \"kubernetes.io/projected/4689e9b9-ff30-42db-b689-949dd272945f-kube-api-access-lfslq\") pod \"4689e9b9-ff30-42db-b689-949dd272945f\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.182570 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-scripts\") pod \"4689e9b9-ff30-42db-b689-949dd272945f\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.182768 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-config-data\") pod \"4689e9b9-ff30-42db-b689-949dd272945f\" (UID: \"4689e9b9-ff30-42db-b689-949dd272945f\") " Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.190940 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4689e9b9-ff30-42db-b689-949dd272945f-kube-api-access-lfslq" (OuterVolumeSpecName: "kube-api-access-lfslq") pod "4689e9b9-ff30-42db-b689-949dd272945f" (UID: "4689e9b9-ff30-42db-b689-949dd272945f"). InnerVolumeSpecName "kube-api-access-lfslq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.192502 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-scripts" (OuterVolumeSpecName: "scripts") pod "4689e9b9-ff30-42db-b689-949dd272945f" (UID: "4689e9b9-ff30-42db-b689-949dd272945f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.227183 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4689e9b9-ff30-42db-b689-949dd272945f" (UID: "4689e9b9-ff30-42db-b689-949dd272945f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.232474 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-config-data" (OuterVolumeSpecName: "config-data") pod "4689e9b9-ff30-42db-b689-949dd272945f" (UID: "4689e9b9-ff30-42db-b689-949dd272945f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.285950 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.285981 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.285996 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfslq\" (UniqueName: \"kubernetes.io/projected/4689e9b9-ff30-42db-b689-949dd272945f-kube-api-access-lfslq\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.286007 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4689e9b9-ff30-42db-b689-949dd272945f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.604397 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.604649 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-hjj9z" event={"ID":"4689e9b9-ff30-42db-b689-949dd272945f","Type":"ContainerDied","Data":"57256129eea7dab8edc6d067578dbc9fbfebb9ab84c402a798f5f78764d5534b"} Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.604834 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57256129eea7dab8edc6d067578dbc9fbfebb9ab84c402a798f5f78764d5534b" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.795575 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:05:40 crc kubenswrapper[4831]: E1203 08:05:40.796067 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4689e9b9-ff30-42db-b689-949dd272945f" containerName="nova-cell1-conductor-db-sync" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.796084 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4689e9b9-ff30-42db-b689-949dd272945f" containerName="nova-cell1-conductor-db-sync" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.796365 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4689e9b9-ff30-42db-b689-949dd272945f" containerName="nova-cell1-conductor-db-sync" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.797225 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.800288 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.815715 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.899253 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.899362 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.899424 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrxhb\" (UniqueName: \"kubernetes.io/projected/a6c85412-de26-4d5f-91c0-fef3c84a682a-kube-api-access-xrxhb\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:40 crc kubenswrapper[4831]: I1203 08:05:40.906440 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gzxbp" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.001259 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-scripts\") pod \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.001355 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bcwmk\" (UniqueName: \"kubernetes.io/projected/ddf02d76-b548-4fab-9e1d-690a64c0be2e-kube-api-access-bcwmk\") pod \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.001423 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-combined-ca-bundle\") pod \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.001481 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-config-data\") pod \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\" (UID: \"ddf02d76-b548-4fab-9e1d-690a64c0be2e\") " Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.001784 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.001894 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-xrxhb\" (UniqueName: \"kubernetes.io/projected/a6c85412-de26-4d5f-91c0-fef3c84a682a-kube-api-access-xrxhb\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.002542 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.006741 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddf02d76-b548-4fab-9e1d-690a64c0be2e-kube-api-access-bcwmk" (OuterVolumeSpecName: "kube-api-access-bcwmk") pod "ddf02d76-b548-4fab-9e1d-690a64c0be2e" (UID: "ddf02d76-b548-4fab-9e1d-690a64c0be2e"). InnerVolumeSpecName "kube-api-access-bcwmk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.007337 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.007443 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.008284 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-scripts" (OuterVolumeSpecName: "scripts") pod "ddf02d76-b548-4fab-9e1d-690a64c0be2e" (UID: "ddf02d76-b548-4fab-9e1d-690a64c0be2e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.017554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrxhb\" (UniqueName: \"kubernetes.io/projected/a6c85412-de26-4d5f-91c0-fef3c84a682a-kube-api-access-xrxhb\") pod \"nova-cell1-conductor-0\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.031481 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-config-data" (OuterVolumeSpecName: "config-data") pod "ddf02d76-b548-4fab-9e1d-690a64c0be2e" (UID: "ddf02d76-b548-4fab-9e1d-690a64c0be2e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.033767 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ddf02d76-b548-4fab-9e1d-690a64c0be2e" (UID: "ddf02d76-b548-4fab-9e1d-690a64c0be2e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.105707 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.105755 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bcwmk\" (UniqueName: \"kubernetes.io/projected/ddf02d76-b548-4fab-9e1d-690a64c0be2e-kube-api-access-bcwmk\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.105772 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.105785 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ddf02d76-b548-4fab-9e1d-690a64c0be2e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.124153 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.399603 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 03 08:05:41 crc kubenswrapper[4831]: W1203 08:05:41.405712 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda6c85412_de26_4d5f_91c0_fef3c84a682a.slice/crio-5bec58cb6906cfcb53c07d0fe4b75ba3e9c48982bc8729f2886b1a707a3144c8 WatchSource:0}: Error finding container 5bec58cb6906cfcb53c07d0fe4b75ba3e9c48982bc8729f2886b1a707a3144c8: Status 404 returned error can't find the container with id 5bec58cb6906cfcb53c07d0fe4b75ba3e9c48982bc8729f2886b1a707a3144c8
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.616062 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-gzxbp" event={"ID":"ddf02d76-b548-4fab-9e1d-690a64c0be2e","Type":"ContainerDied","Data":"ae7a94cfe963ca02c3ae459121cd5a969dad8247e9e32bde67c9d889f70bf87c"}
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.616400 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae7a94cfe963ca02c3ae459121cd5a969dad8247e9e32bde67c9d889f70bf87c"
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.616079 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-gzxbp"
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.619303 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a6c85412-de26-4d5f-91c0-fef3c84a682a","Type":"ContainerStarted","Data":"1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59"}
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.619383 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a6c85412-de26-4d5f-91c0-fef3c84a682a","Type":"ContainerStarted","Data":"5bec58cb6906cfcb53c07d0fe4b75ba3e9c48982bc8729f2886b1a707a3144c8"}
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.619468 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.649343 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=1.649297496 podStartE2EDuration="1.649297496s" podCreationTimestamp="2025-12-03 08:05:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:41.636006922 +0000 UTC m=+5678.979590460" watchObservedRunningTime="2025-12-03 08:05:41.649297496 +0000 UTC m=+5678.992881014"
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.832741 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.833022 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" containerName="nova-api-log" containerID="cri-o://b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821" gracePeriod=30
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.833124 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" containerName="nova-api-api" containerID="cri-o://d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84" gracePeriod=30
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.855073 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.855393 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ba2d1ea0-7885-43ba-86b0-bee016342826" containerName="nova-scheduler-scheduler" containerID="cri-o://f2db86eafc47956a4017c9a91aefd8df080a693ea3bd8adca618989d3e6dc08d" gracePeriod=30
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.868827 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.869036 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerName="nova-metadata-log" containerID="cri-o://8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63" gracePeriod=30
Dec 03 08:05:41 crc kubenswrapper[4831]: I1203 08:05:41.869143 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerName="nova-metadata-metadata" containerID="cri-o://12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f" gracePeriod=30
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.493634 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.501281 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.632082 4831 generic.go:334] "Generic (PLEG): container finished" podID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerID="12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f" exitCode=0
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.633182 4831 generic.go:334] "Generic (PLEG): container finished" podID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerID="8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63" exitCode=143
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.632334 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.632207 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd","Type":"ContainerDied","Data":"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f"}
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.633669 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd","Type":"ContainerDied","Data":"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63"}
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.633778 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd","Type":"ContainerDied","Data":"18f10174971f3f49ff70b141119822bd7d465336f8d8bcbd1a2e47e1148557ff"}
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.633732 4831 scope.go:117] "RemoveContainer" containerID="12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.636384 4831 generic.go:334] "Generic (PLEG): container finished" podID="5960168a-c93f-4f82-b663-9e1bc8108758" containerID="d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84" exitCode=0
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.636421 4831 generic.go:334] "Generic (PLEG): container finished" podID="5960168a-c93f-4f82-b663-9e1bc8108758" containerID="b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821" exitCode=143
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.637090 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.637285 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5960168a-c93f-4f82-b663-9e1bc8108758","Type":"ContainerDied","Data":"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84"}
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.637334 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5960168a-c93f-4f82-b663-9e1bc8108758","Type":"ContainerDied","Data":"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821"}
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.637349 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"5960168a-c93f-4f82-b663-9e1bc8108758","Type":"ContainerDied","Data":"f1a50b65295ca6bd622fffbcf1b4532bd29ee3ada63ddc84c49357187d807591"}
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.637855 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl2l8\" (UniqueName: \"kubernetes.io/projected/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-kube-api-access-fl2l8\") pod \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") "
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.637898 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5960168a-c93f-4f82-b663-9e1bc8108758-logs\") pod \"5960168a-c93f-4f82-b663-9e1bc8108758\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") "
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.637922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f89z4\" (UniqueName: \"kubernetes.io/projected/5960168a-c93f-4f82-b663-9e1bc8108758-kube-api-access-f89z4\") pod \"5960168a-c93f-4f82-b663-9e1bc8108758\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") "
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.637964 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-config-data\") pod \"5960168a-c93f-4f82-b663-9e1bc8108758\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") "
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.638055 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-combined-ca-bundle\") pod \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") "
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.638119 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-combined-ca-bundle\") pod \"5960168a-c93f-4f82-b663-9e1bc8108758\" (UID: \"5960168a-c93f-4f82-b663-9e1bc8108758\") "
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.638160 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-logs\") pod \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") "
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.638253 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-config-data\") pod \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\" (UID: \"7800ac0c-ba24-4fa3-b34c-f6e48bff7efd\") "
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.638564 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5960168a-c93f-4f82-b663-9e1bc8108758-logs" (OuterVolumeSpecName: "logs") pod "5960168a-c93f-4f82-b663-9e1bc8108758" (UID: "5960168a-c93f-4f82-b663-9e1bc8108758"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.638750 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-logs" (OuterVolumeSpecName: "logs") pod "7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" (UID: "7800ac0c-ba24-4fa3-b34c-f6e48bff7efd"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.644466 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-kube-api-access-fl2l8" (OuterVolumeSpecName: "kube-api-access-fl2l8") pod "7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" (UID: "7800ac0c-ba24-4fa3-b34c-f6e48bff7efd"). InnerVolumeSpecName "kube-api-access-fl2l8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.644680 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5960168a-c93f-4f82-b663-9e1bc8108758-kube-api-access-f89z4" (OuterVolumeSpecName: "kube-api-access-f89z4") pod "5960168a-c93f-4f82-b663-9e1bc8108758" (UID: "5960168a-c93f-4f82-b663-9e1bc8108758"). InnerVolumeSpecName "kube-api-access-f89z4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.660103 4831 scope.go:117] "RemoveContainer" containerID="8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.666196 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-config-data" (OuterVolumeSpecName: "config-data") pod "7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" (UID: "7800ac0c-ba24-4fa3-b34c-f6e48bff7efd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.666474 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" (UID: "7800ac0c-ba24-4fa3-b34c-f6e48bff7efd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.668209 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-config-data" (OuterVolumeSpecName: "config-data") pod "5960168a-c93f-4f82-b663-9e1bc8108758" (UID: "5960168a-c93f-4f82-b663-9e1bc8108758"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.674308 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5960168a-c93f-4f82-b663-9e1bc8108758" (UID: "5960168a-c93f-4f82-b663-9e1bc8108758"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.712456 4831 scope.go:117] "RemoveContainer" containerID="12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f"
Dec 03 08:05:42 crc kubenswrapper[4831]: E1203 08:05:42.713001 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f\": container with ID starting with 12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f not found: ID does not exist" containerID="12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.713048 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f"} err="failed to get container status \"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f\": rpc error: code = NotFound desc = could not find container \"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f\": container with ID starting with 12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f not found: ID does not exist"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.713077 4831 scope.go:117] "RemoveContainer" containerID="8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63"
Dec 03 08:05:42 crc kubenswrapper[4831]: E1203 08:05:42.713600 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63\": container with ID starting with 8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63 not found: ID does not exist" containerID="8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.713624 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63"} err="failed to get container status \"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63\": rpc error: code = NotFound desc = could not find container \"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63\": container with ID starting with 8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63 not found: ID does not exist"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.713637 4831 scope.go:117] "RemoveContainer" containerID="12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.713992 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f"} err="failed to get container status \"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f\": rpc error: code = NotFound desc = could not find container \"12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f\": container with ID starting with 12239cd91b5e3accd6f70eeed95b7c9cc8eb67cff79957c84e5e3eb73e94b17f not found: ID does not exist"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.714010 4831 scope.go:117] "RemoveContainer" containerID="8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.714298 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63"} err="failed to get container status \"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63\": rpc error: code = NotFound desc = could not find container \"8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63\": container with ID starting with 8bc02e941998506e86a64418dfb4d503be6b0b4a9244a53806e63d45e318da63 not found: ID does not exist"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.714330 4831 scope.go:117] "RemoveContainer" containerID="d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.742048 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.742098 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-logs\") on node \"crc\" DevicePath \"\""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.742120 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.742141 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl2l8\" (UniqueName: \"kubernetes.io/projected/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-kube-api-access-fl2l8\") on node \"crc\" DevicePath \"\""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.742162 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5960168a-c93f-4f82-b663-9e1bc8108758-logs\") on node \"crc\" DevicePath \"\""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.742181 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f89z4\" (UniqueName: \"kubernetes.io/projected/5960168a-c93f-4f82-b663-9e1bc8108758-kube-api-access-f89z4\") on node \"crc\" DevicePath \"\""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.742200 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5960168a-c93f-4f82-b663-9e1bc8108758-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.742219 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.755502 4831 scope.go:117] "RemoveContainer" containerID="b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.776047 4831 scope.go:117] "RemoveContainer" containerID="d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84"
Dec 03 08:05:42 crc kubenswrapper[4831]: E1203 08:05:42.776530 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84\": container with ID starting with d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84 not found: ID does not exist" containerID="d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.776586 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84"} err="failed to get container status \"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84\": rpc error: code = NotFound desc = could not find container \"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84\": container with ID starting with d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84 not found: ID does not exist"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.776622 4831 scope.go:117] "RemoveContainer" containerID="b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821"
Dec 03 08:05:42 crc kubenswrapper[4831]: E1203 08:05:42.776976 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821\": container with ID starting with b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821 not found: ID does not exist" containerID="b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.777021 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821"} err="failed to get container status \"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821\": rpc error: code = NotFound desc = could not find container \"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821\": container with ID starting with b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821 not found: ID does not exist"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.777052 4831 scope.go:117] "RemoveContainer" containerID="d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.777466 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84"} err="failed to get container status \"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84\": rpc error: code = NotFound desc = could not find container \"d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84\": container with ID starting with d87ed0984c7364e7d53c24c0a8833b23a03976303ace285dc4fc34630f087b84 not found: ID does not exist"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.777542 4831 scope.go:117] "RemoveContainer" containerID="b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.778005 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821"} err="failed to get container status \"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821\": rpc error: code = NotFound desc = could not find container \"b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821\": container with ID starting with b01c4911c9631d5b8ebd2945caf0d2c04868a57492647733f350c167070f2821 not found: ID does not exist"
Dec 03 08:05:42 crc kubenswrapper[4831]: I1203 08:05:42.995553 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.010245 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.045460 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" path="/var/lib/kubelet/pods/7800ac0c-ba24-4fa3-b34c-f6e48bff7efd/volumes"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.046223 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.064066 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.071889 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 08:05:43 crc kubenswrapper[4831]: E1203 08:05:43.072411 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerName="nova-metadata-metadata"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072431 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerName="nova-metadata-metadata"
Dec 03 08:05:43 crc kubenswrapper[4831]: E1203 08:05:43.072465 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" containerName="nova-api-api"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072474 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" containerName="nova-api-api"
Dec 03 08:05:43 crc kubenswrapper[4831]: E1203 08:05:43.072500 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerName="nova-metadata-log"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072508 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerName="nova-metadata-log"
Dec 03 08:05:43 crc kubenswrapper[4831]: E1203 08:05:43.072531 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddf02d76-b548-4fab-9e1d-690a64c0be2e" containerName="nova-manage"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072538 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddf02d76-b548-4fab-9e1d-690a64c0be2e" containerName="nova-manage"
Dec 03 08:05:43 crc kubenswrapper[4831]: E1203 08:05:43.072559 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" containerName="nova-api-log"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072567 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" containerName="nova-api-log"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072797 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" containerName="nova-api-api"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072819 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" containerName="nova-api-log"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072836 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerName="nova-metadata-metadata"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072854 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddf02d76-b548-4fab-9e1d-690a64c0be2e" containerName="nova-manage"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.072866 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7800ac0c-ba24-4fa3-b34c-f6e48bff7efd" containerName="nova-metadata-log"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.074016 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.077753 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.087946 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.100997 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.103040 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.105813 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.108853 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.149238 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.149401 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tcnk\" (UniqueName: \"kubernetes.io/projected/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-kube-api-access-4tcnk\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.149526 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-config-data\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.149620 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/027aa669-9cc6-48a9-a586-96d7262dcb56-logs\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.149737 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.149782 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-logs\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.149880 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-config-data\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.149946 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpzp5\" (UniqueName: \"kubernetes.io/projected/027aa669-9cc6-48a9-a586-96d7262dcb56-kube-api-access-cpzp5\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.252350 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-config-data\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.252447 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/027aa669-9cc6-48a9-a586-96d7262dcb56-logs\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.252507 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.252551 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-logs\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.252623 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-config-data\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.252994 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/027aa669-9cc6-48a9-a586-96d7262dcb56-logs\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.253370 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-logs\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.253560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cpzp5\" (UniqueName: \"kubernetes.io/projected/027aa669-9cc6-48a9-a586-96d7262dcb56-kube-api-access-cpzp5\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.253631 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.253958 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tcnk\" (UniqueName: \"kubernetes.io/projected/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-kube-api-access-4tcnk\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.257059 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-config-data\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.257302 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.258751 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-config-data\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.277250 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.280193 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpzp5\" (UniqueName: \"kubernetes.io/projected/027aa669-9cc6-48a9-a586-96d7262dcb56-kube-api-access-cpzp5\") pod \"nova-metadata-0\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.282927 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tcnk\" (UniqueName: \"kubernetes.io/projected/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-kube-api-access-4tcnk\") pod \"nova-api-0\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " pod="openstack/nova-api-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.461196 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.468479 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.886702 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.901882 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:43 crc kubenswrapper[4831]: I1203 08:05:43.971192 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:05:43 crc kubenswrapper[4831]: W1203 08:05:43.974406 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9b1c29f8_4166_41ce_8fbe_0587eed28a0a.slice/crio-8ba3e78637c75c63883e12fb9c469e3f578cf3210181f0f8524da666bea0e94b WatchSource:0}: Error finding container 8ba3e78637c75c63883e12fb9c469e3f578cf3210181f0f8524da666bea0e94b: Status 404 returned error can't find the container with id 8ba3e78637c75c63883e12fb9c469e3f578cf3210181f0f8524da666bea0e94b Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.025774 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:05:44 crc kubenswrapper[4831]: W1203 08:05:44.026572 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod027aa669_9cc6_48a9_a586_96d7262dcb56.slice/crio-fcb4e966b088aa592485b67efe14b4362ad53aad929de07051aa69734725a294 WatchSource:0}: Error finding container fcb4e966b088aa592485b67efe14b4362ad53aad929de07051aa69734725a294: Status 404 returned error can't find the container with id fcb4e966b088aa592485b67efe14b4362ad53aad929de07051aa69734725a294 Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.193152 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:05:44 crc 
kubenswrapper[4831]: I1203 08:05:44.263055 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58577dbd7f-rl7pg"] Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.263546 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" podUID="fa9bf6d6-c8e0-4326-b177-e79139d03937" containerName="dnsmasq-dns" containerID="cri-o://52f8930b14a6ddaec0c4894c8222932387cf4257cfa44dbb75771337bc0f9100" gracePeriod=10 Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.658116 4831 generic.go:334] "Generic (PLEG): container finished" podID="fa9bf6d6-c8e0-4326-b177-e79139d03937" containerID="52f8930b14a6ddaec0c4894c8222932387cf4257cfa44dbb75771337bc0f9100" exitCode=0 Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.658208 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" event={"ID":"fa9bf6d6-c8e0-4326-b177-e79139d03937","Type":"ContainerDied","Data":"52f8930b14a6ddaec0c4894c8222932387cf4257cfa44dbb75771337bc0f9100"} Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.659905 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b1c29f8-4166-41ce-8fbe-0587eed28a0a","Type":"ContainerStarted","Data":"1d72b1e1a04c76e955ee542cc76cdc13ef6d506f7b6011f86edd6b5f8961fda2"} Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.661241 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b1c29f8-4166-41ce-8fbe-0587eed28a0a","Type":"ContainerStarted","Data":"becdc3486ae606a2bf46b5a2cad39a2eebb5a2e6e426b83e831cb7f592f43f40"} Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.661332 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b1c29f8-4166-41ce-8fbe-0587eed28a0a","Type":"ContainerStarted","Data":"8ba3e78637c75c63883e12fb9c469e3f578cf3210181f0f8524da666bea0e94b"} Dec 03 08:05:44 crc 
kubenswrapper[4831]: I1203 08:05:44.664112 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"027aa669-9cc6-48a9-a586-96d7262dcb56","Type":"ContainerStarted","Data":"3c7bd82db298023341a32e950acf0223eea9402dd67faf80f569870c843360fc"} Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.664197 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"027aa669-9cc6-48a9-a586-96d7262dcb56","Type":"ContainerStarted","Data":"c59079d794e16c737c3793cdf6d573fbad79b4a471b5211d091e4a7e9f15a77d"} Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.664254 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"027aa669-9cc6-48a9-a586-96d7262dcb56","Type":"ContainerStarted","Data":"fcb4e966b088aa592485b67efe14b4362ad53aad929de07051aa69734725a294"} Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.671669 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.688834 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=1.6888110360000002 podStartE2EDuration="1.688811036s" podCreationTimestamp="2025-12-03 08:05:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:44.679060892 +0000 UTC m=+5682.022644410" watchObservedRunningTime="2025-12-03 08:05:44.688811036 +0000 UTC m=+5682.032394544" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.718681 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.718654124 podStartE2EDuration="2.718654124s" podCreationTimestamp="2025-12-03 08:05:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:44.702826092 +0000 UTC m=+5682.046409600" watchObservedRunningTime="2025-12-03 08:05:44.718654124 +0000 UTC m=+5682.062237632" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.731966 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.787842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-nb\") pod \"fa9bf6d6-c8e0-4326-b177-e79139d03937\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.787906 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-dns-svc\") pod \"fa9bf6d6-c8e0-4326-b177-e79139d03937\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.787978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-config\") pod \"fa9bf6d6-c8e0-4326-b177-e79139d03937\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.788010 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2twms\" (UniqueName: \"kubernetes.io/projected/fa9bf6d6-c8e0-4326-b177-e79139d03937-kube-api-access-2twms\") pod \"fa9bf6d6-c8e0-4326-b177-e79139d03937\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.788064 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-sb\") pod \"fa9bf6d6-c8e0-4326-b177-e79139d03937\" (UID: \"fa9bf6d6-c8e0-4326-b177-e79139d03937\") " Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.798692 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa9bf6d6-c8e0-4326-b177-e79139d03937-kube-api-access-2twms" (OuterVolumeSpecName: "kube-api-access-2twms") pod "fa9bf6d6-c8e0-4326-b177-e79139d03937" (UID: "fa9bf6d6-c8e0-4326-b177-e79139d03937"). InnerVolumeSpecName "kube-api-access-2twms". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.838965 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fa9bf6d6-c8e0-4326-b177-e79139d03937" (UID: "fa9bf6d6-c8e0-4326-b177-e79139d03937"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.846458 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fa9bf6d6-c8e0-4326-b177-e79139d03937" (UID: "fa9bf6d6-c8e0-4326-b177-e79139d03937"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.857807 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-config" (OuterVolumeSpecName: "config") pod "fa9bf6d6-c8e0-4326-b177-e79139d03937" (UID: "fa9bf6d6-c8e0-4326-b177-e79139d03937"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.880559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fa9bf6d6-c8e0-4326-b177-e79139d03937" (UID: "fa9bf6d6-c8e0-4326-b177-e79139d03937"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.889688 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.889722 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2twms\" (UniqueName: \"kubernetes.io/projected/fa9bf6d6-c8e0-4326-b177-e79139d03937-kube-api-access-2twms\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.889733 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.889743 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:44 crc kubenswrapper[4831]: I1203 08:05:44.889752 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fa9bf6d6-c8e0-4326-b177-e79139d03937-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:45 crc kubenswrapper[4831]: I1203 08:05:45.024645 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5960168a-c93f-4f82-b663-9e1bc8108758" 
path="/var/lib/kubelet/pods/5960168a-c93f-4f82-b663-9e1bc8108758/volumes" Dec 03 08:05:45 crc kubenswrapper[4831]: I1203 08:05:45.676913 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" event={"ID":"fa9bf6d6-c8e0-4326-b177-e79139d03937","Type":"ContainerDied","Data":"d9d6478fa677ca946398269e09e5d623c07526d6a36fc293822bfb511d6ec230"} Dec 03 08:05:45 crc kubenswrapper[4831]: I1203 08:05:45.677342 4831 scope.go:117] "RemoveContainer" containerID="52f8930b14a6ddaec0c4894c8222932387cf4257cfa44dbb75771337bc0f9100" Dec 03 08:05:45 crc kubenswrapper[4831]: I1203 08:05:45.677594 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58577dbd7f-rl7pg" Dec 03 08:05:45 crc kubenswrapper[4831]: I1203 08:05:45.715477 4831 scope.go:117] "RemoveContainer" containerID="d74941ed6a0efa7b1657daaf7679f4b97520630b721e2316d02314679abd0a2c" Dec 03 08:05:45 crc kubenswrapper[4831]: I1203 08:05:45.718910 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58577dbd7f-rl7pg"] Dec 03 08:05:45 crc kubenswrapper[4831]: I1203 08:05:45.729270 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58577dbd7f-rl7pg"] Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.170088 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.704223 4831 generic.go:334] "Generic (PLEG): container finished" podID="ba2d1ea0-7885-43ba-86b0-bee016342826" containerID="f2db86eafc47956a4017c9a91aefd8df080a693ea3bd8adca618989d3e6dc08d" exitCode=0 Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.705408 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba2d1ea0-7885-43ba-86b0-bee016342826","Type":"ContainerDied","Data":"f2db86eafc47956a4017c9a91aefd8df080a693ea3bd8adca618989d3e6dc08d"} Dec 
03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.713113 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-pclsl"] Dec 03 08:05:46 crc kubenswrapper[4831]: E1203 08:05:46.713793 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9bf6d6-c8e0-4326-b177-e79139d03937" containerName="init" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.713811 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9bf6d6-c8e0-4326-b177-e79139d03937" containerName="init" Dec 03 08:05:46 crc kubenswrapper[4831]: E1203 08:05:46.713829 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa9bf6d6-c8e0-4326-b177-e79139d03937" containerName="dnsmasq-dns" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.713839 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fa9bf6d6-c8e0-4326-b177-e79139d03937" containerName="dnsmasq-dns" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.714064 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa9bf6d6-c8e0-4326-b177-e79139d03937" containerName="dnsmasq-dns" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.714851 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.717638 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.717807 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.720810 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pclsl"] Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.732836 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-scripts\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.732910 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.732971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8fjh\" (UniqueName: \"kubernetes.io/projected/d5493d05-e214-4ce9-ab1d-39dc6d512041-kube-api-access-f8fjh\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.733000 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-config-data\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.835279 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.835435 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8fjh\" (UniqueName: \"kubernetes.io/projected/d5493d05-e214-4ce9-ab1d-39dc6d512041-kube-api-access-f8fjh\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.835481 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-config-data\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.835571 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-scripts\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.841588 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-scripts\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.841737 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.843052 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-config-data\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.854396 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8fjh\" (UniqueName: \"kubernetes.io/projected/d5493d05-e214-4ce9-ab1d-39dc6d512041-kube-api-access-f8fjh\") pod \"nova-cell1-cell-mapping-pclsl\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:46 crc kubenswrapper[4831]: I1203 08:05:46.931083 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.022249 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa9bf6d6-c8e0-4326-b177-e79139d03937" path="/var/lib/kubelet/pods/fa9bf6d6-c8e0-4326-b177-e79139d03937/volumes" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.034970 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.037715 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-config-data\") pod \"ba2d1ea0-7885-43ba-86b0-bee016342826\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.037785 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-combined-ca-bundle\") pod \"ba2d1ea0-7885-43ba-86b0-bee016342826\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.037879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtkxz\" (UniqueName: \"kubernetes.io/projected/ba2d1ea0-7885-43ba-86b0-bee016342826-kube-api-access-jtkxz\") pod \"ba2d1ea0-7885-43ba-86b0-bee016342826\" (UID: \"ba2d1ea0-7885-43ba-86b0-bee016342826\") " Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.042679 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba2d1ea0-7885-43ba-86b0-bee016342826-kube-api-access-jtkxz" (OuterVolumeSpecName: "kube-api-access-jtkxz") pod "ba2d1ea0-7885-43ba-86b0-bee016342826" (UID: "ba2d1ea0-7885-43ba-86b0-bee016342826"). InnerVolumeSpecName "kube-api-access-jtkxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.076440 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ba2d1ea0-7885-43ba-86b0-bee016342826" (UID: "ba2d1ea0-7885-43ba-86b0-bee016342826"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.079770 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-config-data" (OuterVolumeSpecName: "config-data") pod "ba2d1ea0-7885-43ba-86b0-bee016342826" (UID: "ba2d1ea0-7885-43ba-86b0-bee016342826"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.140364 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.140666 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ba2d1ea0-7885-43ba-86b0-bee016342826-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.140684 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtkxz\" (UniqueName: \"kubernetes.io/projected/ba2d1ea0-7885-43ba-86b0-bee016342826-kube-api-access-jtkxz\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.306103 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-pclsl"] Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.718672 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.718666 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ba2d1ea0-7885-43ba-86b0-bee016342826","Type":"ContainerDied","Data":"64b658b062375f00f4f95afef60b8b7dc289e249e78021596570eed2ec953f86"} Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.719086 4831 scope.go:117] "RemoveContainer" containerID="f2db86eafc47956a4017c9a91aefd8df080a693ea3bd8adca618989d3e6dc08d" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.722617 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pclsl" event={"ID":"d5493d05-e214-4ce9-ab1d-39dc6d512041","Type":"ContainerStarted","Data":"32686504a7304266d746f4f6d0cf7c9dd1e65a000d8c310e047b13bf624f9773"} Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.722677 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pclsl" event={"ID":"d5493d05-e214-4ce9-ab1d-39dc6d512041","Type":"ContainerStarted","Data":"15a058619561f9dce2170ff15ddc5184ef3356b50904c0486205a6f926e23565"} Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.775835 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-pclsl" podStartSLOduration=1.7758113930000001 podStartE2EDuration="1.775811393s" podCreationTimestamp="2025-12-03 08:05:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:47.769677772 +0000 UTC m=+5685.113261280" watchObservedRunningTime="2025-12-03 08:05:47.775811393 +0000 UTC m=+5685.119394911" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.799801 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.816751 4831 kubelet.go:2431] "SyncLoop 
REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.832613 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:47 crc kubenswrapper[4831]: E1203 08:05:47.833092 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba2d1ea0-7885-43ba-86b0-bee016342826" containerName="nova-scheduler-scheduler" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.833114 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba2d1ea0-7885-43ba-86b0-bee016342826" containerName="nova-scheduler-scheduler" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.833306 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba2d1ea0-7885-43ba-86b0-bee016342826" containerName="nova-scheduler-scheduler" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.833978 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.837168 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.840707 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.859904 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-config-data\") pod \"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.859989 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh8jr\" (UniqueName: \"kubernetes.io/projected/ccf8fba2-1634-4af1-961d-c747d2494350-kube-api-access-fh8jr\") pod 
\"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.860025 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.962902 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-config-data\") pod \"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.963022 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh8jr\" (UniqueName: \"kubernetes.io/projected/ccf8fba2-1634-4af1-961d-c747d2494350-kube-api-access-fh8jr\") pod \"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.963077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.969560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-config-data\") pod \"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 
08:05:47.970634 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:47 crc kubenswrapper[4831]: I1203 08:05:47.985756 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh8jr\" (UniqueName: \"kubernetes.io/projected/ccf8fba2-1634-4af1-961d-c747d2494350-kube-api-access-fh8jr\") pod \"nova-scheduler-0\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:48 crc kubenswrapper[4831]: I1203 08:05:48.159405 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:48 crc kubenswrapper[4831]: I1203 08:05:48.462217 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 08:05:48 crc kubenswrapper[4831]: I1203 08:05:48.462505 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 08:05:48 crc kubenswrapper[4831]: I1203 08:05:48.608201 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:48 crc kubenswrapper[4831]: I1203 08:05:48.738510 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ccf8fba2-1634-4af1-961d-c747d2494350","Type":"ContainerStarted","Data":"4e3b1abde8e5ddc1bbddd314635cbc0d02461b1c3d28e7dc7badc80457659014"} Dec 03 08:05:49 crc kubenswrapper[4831]: I1203 08:05:49.028069 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba2d1ea0-7885-43ba-86b0-bee016342826" path="/var/lib/kubelet/pods/ba2d1ea0-7885-43ba-86b0-bee016342826/volumes" Dec 03 08:05:49 crc kubenswrapper[4831]: I1203 08:05:49.750417 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-scheduler-0" event={"ID":"ccf8fba2-1634-4af1-961d-c747d2494350","Type":"ContainerStarted","Data":"c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545"} Dec 03 08:05:49 crc kubenswrapper[4831]: I1203 08:05:49.783974 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.783945466 podStartE2EDuration="2.783945466s" podCreationTimestamp="2025-12-03 08:05:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:49.774361748 +0000 UTC m=+5687.117945286" watchObservedRunningTime="2025-12-03 08:05:49.783945466 +0000 UTC m=+5687.127529004" Dec 03 08:05:52 crc kubenswrapper[4831]: I1203 08:05:52.788939 4831 generic.go:334] "Generic (PLEG): container finished" podID="d5493d05-e214-4ce9-ab1d-39dc6d512041" containerID="32686504a7304266d746f4f6d0cf7c9dd1e65a000d8c310e047b13bf624f9773" exitCode=0 Dec 03 08:05:52 crc kubenswrapper[4831]: I1203 08:05:52.789136 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pclsl" event={"ID":"d5493d05-e214-4ce9-ab1d-39dc6d512041","Type":"ContainerDied","Data":"32686504a7304266d746f4f6d0cf7c9dd1e65a000d8c310e047b13bf624f9773"} Dec 03 08:05:53 crc kubenswrapper[4831]: I1203 08:05:53.160466 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 08:05:53 crc kubenswrapper[4831]: I1203 08:05:53.461889 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 08:05:53 crc kubenswrapper[4831]: I1203 08:05:53.462286 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 08:05:53 crc kubenswrapper[4831]: I1203 08:05:53.469551 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 
03 08:05:53 crc kubenswrapper[4831]: I1203 08:05:53.469617 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.176915 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.196604 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-scripts\") pod \"d5493d05-e214-4ce9-ab1d-39dc6d512041\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.196703 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-combined-ca-bundle\") pod \"d5493d05-e214-4ce9-ab1d-39dc6d512041\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.196759 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-config-data\") pod \"d5493d05-e214-4ce9-ab1d-39dc6d512041\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.196796 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8fjh\" (UniqueName: \"kubernetes.io/projected/d5493d05-e214-4ce9-ab1d-39dc6d512041-kube-api-access-f8fjh\") pod \"d5493d05-e214-4ce9-ab1d-39dc6d512041\" (UID: \"d5493d05-e214-4ce9-ab1d-39dc6d512041\") " Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.204201 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5493d05-e214-4ce9-ab1d-39dc6d512041-kube-api-access-f8fjh" 
(OuterVolumeSpecName: "kube-api-access-f8fjh") pod "d5493d05-e214-4ce9-ab1d-39dc6d512041" (UID: "d5493d05-e214-4ce9-ab1d-39dc6d512041"). InnerVolumeSpecName "kube-api-access-f8fjh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.211450 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-scripts" (OuterVolumeSpecName: "scripts") pod "d5493d05-e214-4ce9-ab1d-39dc6d512041" (UID: "d5493d05-e214-4ce9-ab1d-39dc6d512041"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.240416 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-config-data" (OuterVolumeSpecName: "config-data") pod "d5493d05-e214-4ce9-ab1d-39dc6d512041" (UID: "d5493d05-e214-4ce9-ab1d-39dc6d512041"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.263431 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d5493d05-e214-4ce9-ab1d-39dc6d512041" (UID: "d5493d05-e214-4ce9-ab1d-39dc6d512041"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.299452 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.299705 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8fjh\" (UniqueName: \"kubernetes.io/projected/d5493d05-e214-4ce9-ab1d-39dc6d512041-kube-api-access-f8fjh\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.299847 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.299929 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d5493d05-e214-4ce9-ab1d-39dc6d512041-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.626569 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.626614 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.626689 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" 
podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.76:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.627091 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.75:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.810868 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-pclsl" event={"ID":"d5493d05-e214-4ce9-ab1d-39dc6d512041","Type":"ContainerDied","Data":"15a058619561f9dce2170ff15ddc5184ef3356b50904c0486205a6f926e23565"} Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.810936 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15a058619561f9dce2170ff15ddc5184ef3356b50904c0486205a6f926e23565" Dec 03 08:05:54 crc kubenswrapper[4831]: I1203 08:05:54.811344 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-pclsl" Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.026824 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.027045 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="ccf8fba2-1634-4af1-961d-c747d2494350" containerName="nova-scheduler-scheduler" containerID="cri-o://c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545" gracePeriod=30 Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.051916 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.052205 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-log" containerID="cri-o://becdc3486ae606a2bf46b5a2cad39a2eebb5a2e6e426b83e831cb7f592f43f40" gracePeriod=30 Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.052423 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-api" containerID="cri-o://1d72b1e1a04c76e955ee542cc76cdc13ef6d506f7b6011f86edd6b5f8961fda2" gracePeriod=30 Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.063609 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.063997 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-log" containerID="cri-o://c59079d794e16c737c3793cdf6d573fbad79b4a471b5211d091e4a7e9f15a77d" gracePeriod=30 Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.064122 4831 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-metadata" containerID="cri-o://3c7bd82db298023341a32e950acf0223eea9402dd67faf80f569870c843360fc" gracePeriod=30 Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.848136 4831 generic.go:334] "Generic (PLEG): container finished" podID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerID="c59079d794e16c737c3793cdf6d573fbad79b4a471b5211d091e4a7e9f15a77d" exitCode=143 Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.848215 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"027aa669-9cc6-48a9-a586-96d7262dcb56","Type":"ContainerDied","Data":"c59079d794e16c737c3793cdf6d573fbad79b4a471b5211d091e4a7e9f15a77d"} Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.852084 4831 generic.go:334] "Generic (PLEG): container finished" podID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerID="becdc3486ae606a2bf46b5a2cad39a2eebb5a2e6e426b83e831cb7f592f43f40" exitCode=143 Dec 03 08:05:55 crc kubenswrapper[4831]: I1203 08:05:55.852140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b1c29f8-4166-41ce-8fbe-0587eed28a0a","Type":"ContainerDied","Data":"becdc3486ae606a2bf46b5a2cad39a2eebb5a2e6e426b83e831cb7f592f43f40"} Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.450982 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.549269 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-config-data\") pod \"ccf8fba2-1634-4af1-961d-c747d2494350\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.549375 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fh8jr\" (UniqueName: \"kubernetes.io/projected/ccf8fba2-1634-4af1-961d-c747d2494350-kube-api-access-fh8jr\") pod \"ccf8fba2-1634-4af1-961d-c747d2494350\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.549490 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-combined-ca-bundle\") pod \"ccf8fba2-1634-4af1-961d-c747d2494350\" (UID: \"ccf8fba2-1634-4af1-961d-c747d2494350\") " Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.555846 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccf8fba2-1634-4af1-961d-c747d2494350-kube-api-access-fh8jr" (OuterVolumeSpecName: "kube-api-access-fh8jr") pod "ccf8fba2-1634-4af1-961d-c747d2494350" (UID: "ccf8fba2-1634-4af1-961d-c747d2494350"). InnerVolumeSpecName "kube-api-access-fh8jr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.577907 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-config-data" (OuterVolumeSpecName: "config-data") pod "ccf8fba2-1634-4af1-961d-c747d2494350" (UID: "ccf8fba2-1634-4af1-961d-c747d2494350"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.592855 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ccf8fba2-1634-4af1-961d-c747d2494350" (UID: "ccf8fba2-1634-4af1-961d-c747d2494350"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.651109 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.651171 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ccf8fba2-1634-4af1-961d-c747d2494350-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.651196 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fh8jr\" (UniqueName: \"kubernetes.io/projected/ccf8fba2-1634-4af1-961d-c747d2494350-kube-api-access-fh8jr\") on node \"crc\" DevicePath \"\"" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.880640 4831 generic.go:334] "Generic (PLEG): container finished" podID="ccf8fba2-1634-4af1-961d-c747d2494350" containerID="c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545" exitCode=0 Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.880728 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"ccf8fba2-1634-4af1-961d-c747d2494350","Type":"ContainerDied","Data":"c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545"} Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.881020 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-scheduler-0" event={"ID":"ccf8fba2-1634-4af1-961d-c747d2494350","Type":"ContainerDied","Data":"4e3b1abde8e5ddc1bbddd314635cbc0d02461b1c3d28e7dc7badc80457659014"} Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.881057 4831 scope.go:117] "RemoveContainer" containerID="c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.880760 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.918904 4831 scope.go:117] "RemoveContainer" containerID="c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545" Dec 03 08:05:56 crc kubenswrapper[4831]: E1203 08:05:56.919950 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545\": container with ID starting with c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545 not found: ID does not exist" containerID="c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.920025 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545"} err="failed to get container status \"c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545\": rpc error: code = NotFound desc = could not find container \"c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545\": container with ID starting with c5626c4d7e4f9b4506c97ffb5fe21c8c3c5d0e6f56333320081e732f7ca90545 not found: ID does not exist" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.938597 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.947873 4831 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.969010 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:56 crc kubenswrapper[4831]: E1203 08:05:56.974694 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccf8fba2-1634-4af1-961d-c747d2494350" containerName="nova-scheduler-scheduler" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.974737 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccf8fba2-1634-4af1-961d-c747d2494350" containerName="nova-scheduler-scheduler" Dec 03 08:05:56 crc kubenswrapper[4831]: E1203 08:05:56.974757 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5493d05-e214-4ce9-ab1d-39dc6d512041" containerName="nova-manage" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.974768 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5493d05-e214-4ce9-ab1d-39dc6d512041" containerName="nova-manage" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.975153 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5493d05-e214-4ce9-ab1d-39dc6d512041" containerName="nova-manage" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.975195 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccf8fba2-1634-4af1-961d-c747d2494350" containerName="nova-scheduler-scheduler" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.976184 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:56 crc kubenswrapper[4831]: I1203 08:05:56.980407 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.024497 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccf8fba2-1634-4af1-961d-c747d2494350" path="/var/lib/kubelet/pods/ccf8fba2-1634-4af1-961d-c747d2494350/volumes" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.024993 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.059867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-config-data\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.059987 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg7nw\" (UniqueName: \"kubernetes.io/projected/455748d2-1e53-4029-a091-77d21b68e219-kube-api-access-xg7nw\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.060261 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.163704 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xg7nw\" (UniqueName: 
\"kubernetes.io/projected/455748d2-1e53-4029-a091-77d21b68e219-kube-api-access-xg7nw\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.163940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.164166 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-config-data\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.169829 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.174009 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-config-data\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.186813 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xg7nw\" (UniqueName: \"kubernetes.io/projected/455748d2-1e53-4029-a091-77d21b68e219-kube-api-access-xg7nw\") pod \"nova-scheduler-0\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " 
pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.324086 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.596590 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.597005 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.597064 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.597877 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.597941 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" gracePeriod=600 Dec 03 
08:05:57 crc kubenswrapper[4831]: E1203 08:05:57.720405 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:05:57 crc kubenswrapper[4831]: W1203 08:05:57.836189 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod455748d2_1e53_4029_a091_77d21b68e219.slice/crio-025c93d122dc151904c60c439c62bf3afff6b59e3ad44f0781ec14e7e8c6dadf WatchSource:0}: Error finding container 025c93d122dc151904c60c439c62bf3afff6b59e3ad44f0781ec14e7e8c6dadf: Status 404 returned error can't find the container with id 025c93d122dc151904c60c439c62bf3afff6b59e3ad44f0781ec14e7e8c6dadf Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.837667 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.894166 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" exitCode=0 Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.894259 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342"} Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.894358 4831 scope.go:117] "RemoveContainer" containerID="b048367eeeb7f28a9954ddb05cb80ad0ca3f94ca3e078cded79e040a0e5a5a5d" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 
08:05:57.895109 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:05:57 crc kubenswrapper[4831]: E1203 08:05:57.895675 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:05:57 crc kubenswrapper[4831]: I1203 08:05:57.896372 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"455748d2-1e53-4029-a091-77d21b68e219","Type":"ContainerStarted","Data":"025c93d122dc151904c60c439c62bf3afff6b59e3ad44f0781ec14e7e8c6dadf"} Dec 03 08:05:58 crc kubenswrapper[4831]: I1203 08:05:58.917303 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"455748d2-1e53-4029-a091-77d21b68e219","Type":"ContainerStarted","Data":"2f243a0192fe0f0f7d13fb18f6ac6c723d3c8a64197b85f337ad42a2446030ea"} Dec 03 08:05:58 crc kubenswrapper[4831]: I1203 08:05:58.943817 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.943756579 podStartE2EDuration="2.943756579s" podCreationTimestamp="2025-12-03 08:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:05:58.942573442 +0000 UTC m=+5696.286156990" watchObservedRunningTime="2025-12-03 08:05:58.943756579 +0000 UTC m=+5696.287340127" Dec 03 08:05:59 crc kubenswrapper[4831]: I1203 08:05:59.931106 4831 generic.go:334] "Generic (PLEG): container finished" podID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" 
containerID="1d72b1e1a04c76e955ee542cc76cdc13ef6d506f7b6011f86edd6b5f8961fda2" exitCode=0 Dec 03 08:05:59 crc kubenswrapper[4831]: I1203 08:05:59.931470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b1c29f8-4166-41ce-8fbe-0587eed28a0a","Type":"ContainerDied","Data":"1d72b1e1a04c76e955ee542cc76cdc13ef6d506f7b6011f86edd6b5f8961fda2"} Dec 03 08:05:59 crc kubenswrapper[4831]: I1203 08:05:59.934051 4831 generic.go:334] "Generic (PLEG): container finished" podID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerID="3c7bd82db298023341a32e950acf0223eea9402dd67faf80f569870c843360fc" exitCode=0 Dec 03 08:05:59 crc kubenswrapper[4831]: I1203 08:05:59.940405 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"027aa669-9cc6-48a9-a586-96d7262dcb56","Type":"ContainerDied","Data":"3c7bd82db298023341a32e950acf0223eea9402dd67faf80f569870c843360fc"} Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.094017 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.103284 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.253530 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/027aa669-9cc6-48a9-a586-96d7262dcb56-logs\") pod \"027aa669-9cc6-48a9-a586-96d7262dcb56\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.253922 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-config-data\") pod \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.253978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-config-data\") pod \"027aa669-9cc6-48a9-a586-96d7262dcb56\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.254090 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-combined-ca-bundle\") pod \"027aa669-9cc6-48a9-a586-96d7262dcb56\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.254173 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-combined-ca-bundle\") pod \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.254219 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tcnk\" (UniqueName: 
\"kubernetes.io/projected/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-kube-api-access-4tcnk\") pod \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.254267 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-logs\") pod \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\" (UID: \"9b1c29f8-4166-41ce-8fbe-0587eed28a0a\") " Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.254311 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpzp5\" (UniqueName: \"kubernetes.io/projected/027aa669-9cc6-48a9-a586-96d7262dcb56-kube-api-access-cpzp5\") pod \"027aa669-9cc6-48a9-a586-96d7262dcb56\" (UID: \"027aa669-9cc6-48a9-a586-96d7262dcb56\") " Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.254436 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/027aa669-9cc6-48a9-a586-96d7262dcb56-logs" (OuterVolumeSpecName: "logs") pod "027aa669-9cc6-48a9-a586-96d7262dcb56" (UID: "027aa669-9cc6-48a9-a586-96d7262dcb56"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.254930 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/027aa669-9cc6-48a9-a586-96d7262dcb56-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.256017 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-logs" (OuterVolumeSpecName: "logs") pod "9b1c29f8-4166-41ce-8fbe-0587eed28a0a" (UID: "9b1c29f8-4166-41ce-8fbe-0587eed28a0a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.261313 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/027aa669-9cc6-48a9-a586-96d7262dcb56-kube-api-access-cpzp5" (OuterVolumeSpecName: "kube-api-access-cpzp5") pod "027aa669-9cc6-48a9-a586-96d7262dcb56" (UID: "027aa669-9cc6-48a9-a586-96d7262dcb56"). InnerVolumeSpecName "kube-api-access-cpzp5". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.261613 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-kube-api-access-4tcnk" (OuterVolumeSpecName: "kube-api-access-4tcnk") pod "9b1c29f8-4166-41ce-8fbe-0587eed28a0a" (UID: "9b1c29f8-4166-41ce-8fbe-0587eed28a0a"). InnerVolumeSpecName "kube-api-access-4tcnk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.278723 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-config-data" (OuterVolumeSpecName: "config-data") pod "9b1c29f8-4166-41ce-8fbe-0587eed28a0a" (UID: "9b1c29f8-4166-41ce-8fbe-0587eed28a0a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.294004 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-config-data" (OuterVolumeSpecName: "config-data") pod "027aa669-9cc6-48a9-a586-96d7262dcb56" (UID: "027aa669-9cc6-48a9-a586-96d7262dcb56"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.309881 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9b1c29f8-4166-41ce-8fbe-0587eed28a0a" (UID: "9b1c29f8-4166-41ce-8fbe-0587eed28a0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.314475 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "027aa669-9cc6-48a9-a586-96d7262dcb56" (UID: "027aa669-9cc6-48a9-a586-96d7262dcb56"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.356432 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.356468 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tcnk\" (UniqueName: \"kubernetes.io/projected/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-kube-api-access-4tcnk\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.356480 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.356489 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cpzp5\" (UniqueName: 
\"kubernetes.io/projected/027aa669-9cc6-48a9-a586-96d7262dcb56-kube-api-access-cpzp5\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.356498 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9b1c29f8-4166-41ce-8fbe-0587eed28a0a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.356507 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.356514 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/027aa669-9cc6-48a9-a586-96d7262dcb56-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.959400 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"027aa669-9cc6-48a9-a586-96d7262dcb56","Type":"ContainerDied","Data":"fcb4e966b088aa592485b67efe14b4362ad53aad929de07051aa69734725a294"} Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.959485 4831 scope.go:117] "RemoveContainer" containerID="3c7bd82db298023341a32e950acf0223eea9402dd67faf80f569870c843360fc" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.959720 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.965490 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"9b1c29f8-4166-41ce-8fbe-0587eed28a0a","Type":"ContainerDied","Data":"8ba3e78637c75c63883e12fb9c469e3f578cf3210181f0f8524da666bea0e94b"} Dec 03 08:06:00 crc kubenswrapper[4831]: I1203 08:06:00.965754 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.011635 4831 scope.go:117] "RemoveContainer" containerID="c59079d794e16c737c3793cdf6d573fbad79b4a471b5211d091e4a7e9f15a77d" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.046040 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.057599 4831 scope.go:117] "RemoveContainer" containerID="1d72b1e1a04c76e955ee542cc76cdc13ef6d506f7b6011f86edd6b5f8961fda2" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.058848 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.073848 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.086766 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.101493 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: E1203 08:06:01.102014 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-metadata" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.102035 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-metadata" Dec 03 08:06:01 crc kubenswrapper[4831]: E1203 08:06:01.102061 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-api" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.102071 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-api" Dec 03 08:06:01 crc kubenswrapper[4831]: 
E1203 08:06:01.102099 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-log" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.102108 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-log" Dec 03 08:06:01 crc kubenswrapper[4831]: E1203 08:06:01.102131 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-log" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.102140 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-log" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.102374 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-log" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.102409 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-log" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.102425 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" containerName="nova-api-api" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.102439 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" containerName="nova-metadata-metadata" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.103622 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.106691 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.110189 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.115104 4831 scope.go:117] "RemoveContainer" containerID="becdc3486ae606a2bf46b5a2cad39a2eebb5a2e6e426b83e831cb7f592f43f40" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.130247 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.132226 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.134361 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.138739 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.278437 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74091ef-35c2-465b-a80d-5df2772e3f9d-logs\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.279022 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28qcg\" (UniqueName: \"kubernetes.io/projected/f74091ef-35c2-465b-a80d-5df2772e3f9d-kube-api-access-28qcg\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.279140 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-config-data\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.279201 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qbqx\" (UniqueName: \"kubernetes.io/projected/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-kube-api-access-4qbqx\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.279244 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-config-data\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.279599 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.279660 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-logs\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.279709 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.382196 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-config-data\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.382257 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qbqx\" (UniqueName: \"kubernetes.io/projected/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-kube-api-access-4qbqx\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.382288 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-config-data\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.382310 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.382362 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-logs\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 
08:06:01.382385 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.382485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74091ef-35c2-465b-a80d-5df2772e3f9d-logs\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.383338 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74091ef-35c2-465b-a80d-5df2772e3f9d-logs\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.383359 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-logs\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.383551 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28qcg\" (UniqueName: \"kubernetes.io/projected/f74091ef-35c2-465b-a80d-5df2772e3f9d-kube-api-access-28qcg\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.388871 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-config-data\") pod \"nova-metadata-0\" (UID: 
\"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.390041 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.397781 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.397940 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-config-data\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.402989 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28qcg\" (UniqueName: \"kubernetes.io/projected/f74091ef-35c2-465b-a80d-5df2772e3f9d-kube-api-access-28qcg\") pod \"nova-metadata-0\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.405961 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qbqx\" (UniqueName: \"kubernetes.io/projected/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-kube-api-access-4qbqx\") pod \"nova-api-0\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.432010 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.453551 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:06:01 crc kubenswrapper[4831]: W1203 08:06:01.808951 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode2d6f10f_5fa7_4f7d_b814_332a9a8db55f.slice/crio-7b1c0a07191f6de5e792d7ca4d4d19a33ef9910708e2fe9e9fb66d8be6b16007 WatchSource:0}: Error finding container 7b1c0a07191f6de5e792d7ca4d4d19a33ef9910708e2fe9e9fb66d8be6b16007: Status 404 returned error can't find the container with id 7b1c0a07191f6de5e792d7ca4d4d19a33ef9910708e2fe9e9fb66d8be6b16007 Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.810838 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.934458 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:01 crc kubenswrapper[4831]: W1203 08:06:01.943141 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf74091ef_35c2_465b_a80d_5df2772e3f9d.slice/crio-ae076ba64f9c9872f6dc0b44e8597237cfea8a98c0f4e975dc12b52019afb791 WatchSource:0}: Error finding container ae076ba64f9c9872f6dc0b44e8597237cfea8a98c0f4e975dc12b52019afb791: Status 404 returned error can't find the container with id ae076ba64f9c9872f6dc0b44e8597237cfea8a98c0f4e975dc12b52019afb791 Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.982043 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f","Type":"ContainerStarted","Data":"7b1c0a07191f6de5e792d7ca4d4d19a33ef9910708e2fe9e9fb66d8be6b16007"} Dec 03 08:06:01 crc kubenswrapper[4831]: I1203 08:06:01.985896 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74091ef-35c2-465b-a80d-5df2772e3f9d","Type":"ContainerStarted","Data":"ae076ba64f9c9872f6dc0b44e8597237cfea8a98c0f4e975dc12b52019afb791"} Dec 03 08:06:02 crc kubenswrapper[4831]: I1203 08:06:02.324808 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 08:06:03 crc kubenswrapper[4831]: I1203 08:06:03.000440 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f","Type":"ContainerStarted","Data":"21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd"} Dec 03 08:06:03 crc kubenswrapper[4831]: I1203 08:06:03.000890 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f","Type":"ContainerStarted","Data":"deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09"} Dec 03 08:06:03 crc kubenswrapper[4831]: I1203 08:06:03.003435 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74091ef-35c2-465b-a80d-5df2772e3f9d","Type":"ContainerStarted","Data":"e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56"} Dec 03 08:06:03 crc kubenswrapper[4831]: I1203 08:06:03.003480 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74091ef-35c2-465b-a80d-5df2772e3f9d","Type":"ContainerStarted","Data":"e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290"} Dec 03 08:06:03 crc kubenswrapper[4831]: I1203 08:06:03.068872 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.068841601 podStartE2EDuration="2.068841601s" podCreationTimestamp="2025-12-03 08:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 
08:06:03.040414666 +0000 UTC m=+5700.383998204" watchObservedRunningTime="2025-12-03 08:06:03.068841601 +0000 UTC m=+5700.412425119" Dec 03 08:06:03 crc kubenswrapper[4831]: I1203 08:06:03.070775 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="027aa669-9cc6-48a9-a586-96d7262dcb56" path="/var/lib/kubelet/pods/027aa669-9cc6-48a9-a586-96d7262dcb56/volumes" Dec 03 08:06:03 crc kubenswrapper[4831]: I1203 08:06:03.071467 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b1c29f8-4166-41ce-8fbe-0587eed28a0a" path="/var/lib/kubelet/pods/9b1c29f8-4166-41ce-8fbe-0587eed28a0a/volumes" Dec 03 08:06:03 crc kubenswrapper[4831]: I1203 08:06:03.101818 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.1017938369999998 podStartE2EDuration="2.101793837s" podCreationTimestamp="2025-12-03 08:06:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:03.093951412 +0000 UTC m=+5700.437534920" watchObservedRunningTime="2025-12-03 08:06:03.101793837 +0000 UTC m=+5700.445377355" Dec 03 08:06:06 crc kubenswrapper[4831]: I1203 08:06:06.432237 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 08:06:06 crc kubenswrapper[4831]: I1203 08:06:06.432787 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 08:06:07 crc kubenswrapper[4831]: I1203 08:06:07.324688 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 08:06:07 crc kubenswrapper[4831]: I1203 08:06:07.374153 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 08:06:08 crc kubenswrapper[4831]: I1203 08:06:08.118416 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/nova-scheduler-0" Dec 03 08:06:09 crc kubenswrapper[4831]: I1203 08:06:09.012740 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:06:09 crc kubenswrapper[4831]: E1203 08:06:09.013224 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:06:11 crc kubenswrapper[4831]: I1203 08:06:11.432444 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 08:06:11 crc kubenswrapper[4831]: I1203 08:06:11.433387 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 08:06:11 crc kubenswrapper[4831]: I1203 08:06:11.455078 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 08:06:11 crc kubenswrapper[4831]: I1203 08:06:11.455195 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 08:06:12 crc kubenswrapper[4831]: I1203 08:06:12.597568 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.80:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:06:12 crc kubenswrapper[4831]: I1203 08:06:12.597568 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-metadata" 
probeResult="failure" output="Get \"http://10.217.1.80:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:06:12 crc kubenswrapper[4831]: I1203 08:06:12.597636 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:06:12 crc kubenswrapper[4831]: I1203 08:06:12.597921 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.81:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:06:19 crc kubenswrapper[4831]: I1203 08:06:19.773508 4831 scope.go:117] "RemoveContainer" containerID="9ccc325c6c1a21c7688c9a00f678b1bda9089ee92ad8bdba2bd474ba475ef818" Dec 03 08:06:21 crc kubenswrapper[4831]: I1203 08:06:21.013463 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:06:21 crc kubenswrapper[4831]: E1203 08:06:21.014057 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:06:21 crc kubenswrapper[4831]: I1203 08:06:21.435695 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 08:06:21 crc kubenswrapper[4831]: I1203 08:06:21.438057 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openstack/nova-metadata-0" Dec 03 08:06:21 crc kubenswrapper[4831]: I1203 08:06:21.438916 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 08:06:21 crc kubenswrapper[4831]: I1203 08:06:21.467210 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 08:06:21 crc kubenswrapper[4831]: I1203 08:06:21.467572 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 08:06:21 crc kubenswrapper[4831]: I1203 08:06:21.467756 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 08:06:21 crc kubenswrapper[4831]: I1203 08:06:21.476021 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.202261 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.204419 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.207095 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.466803 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-9fc9f4dc7-bbl65"] Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.468861 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.482956 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fc9f4dc7-bbl65"] Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.643660 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th2rv\" (UniqueName: \"kubernetes.io/projected/a04ea935-caf4-46ad-8187-bd86753b3692-kube-api-access-th2rv\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.643753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-config\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.643857 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.643933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-dns-svc\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.644045 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.745663 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.745734 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th2rv\" (UniqueName: \"kubernetes.io/projected/a04ea935-caf4-46ad-8187-bd86753b3692-kube-api-access-th2rv\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.745760 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-config\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.745803 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.745838 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-dns-svc\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.746588 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-dns-svc\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.746690 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-sb\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.747421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-config\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.747446 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-nb\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.784932 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th2rv\" (UniqueName: \"kubernetes.io/projected/a04ea935-caf4-46ad-8187-bd86753b3692-kube-api-access-th2rv\") pod \"dnsmasq-dns-9fc9f4dc7-bbl65\" (UID: 
\"a04ea935-caf4-46ad-8187-bd86753b3692\") " pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:22 crc kubenswrapper[4831]: I1203 08:06:22.800497 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:23 crc kubenswrapper[4831]: I1203 08:06:23.271086 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-9fc9f4dc7-bbl65"] Dec 03 08:06:24 crc kubenswrapper[4831]: I1203 08:06:24.218286 4831 generic.go:334] "Generic (PLEG): container finished" podID="a04ea935-caf4-46ad-8187-bd86753b3692" containerID="b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936" exitCode=0 Dec 03 08:06:24 crc kubenswrapper[4831]: I1203 08:06:24.218339 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" event={"ID":"a04ea935-caf4-46ad-8187-bd86753b3692","Type":"ContainerDied","Data":"b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936"} Dec 03 08:06:24 crc kubenswrapper[4831]: I1203 08:06:24.218802 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" event={"ID":"a04ea935-caf4-46ad-8187-bd86753b3692","Type":"ContainerStarted","Data":"ce6b09b87c96c05f4b0f1b3a2f819ee9c261018e109d0a0b6840c3287c964d26"} Dec 03 08:06:25 crc kubenswrapper[4831]: I1203 08:06:25.227532 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" event={"ID":"a04ea935-caf4-46ad-8187-bd86753b3692","Type":"ContainerStarted","Data":"0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5"} Dec 03 08:06:25 crc kubenswrapper[4831]: I1203 08:06:25.228968 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:25 crc kubenswrapper[4831]: I1203 08:06:25.263993 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" 
podStartSLOduration=3.263975864 podStartE2EDuration="3.263975864s" podCreationTimestamp="2025-12-03 08:06:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:25.259460223 +0000 UTC m=+5722.603043721" watchObservedRunningTime="2025-12-03 08:06:25.263975864 +0000 UTC m=+5722.607559372" Dec 03 08:06:32 crc kubenswrapper[4831]: I1203 08:06:32.013394 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:06:32 crc kubenswrapper[4831]: E1203 08:06:32.014367 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:06:32 crc kubenswrapper[4831]: I1203 08:06:32.802696 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:32 crc kubenswrapper[4831]: I1203 08:06:32.896901 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77595d75f7-z7pp2"] Dec 03 08:06:32 crc kubenswrapper[4831]: I1203 08:06:32.897284 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" podUID="a8de247c-1cb7-448b-9071-0742bea10b51" containerName="dnsmasq-dns" containerID="cri-o://c2aa2133cc51efb73d16b353a60c8108bd5985f29ff0bcd06041b48f162bbefc" gracePeriod=10 Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.329795 4831 generic.go:334] "Generic (PLEG): container finished" podID="a8de247c-1cb7-448b-9071-0742bea10b51" 
containerID="c2aa2133cc51efb73d16b353a60c8108bd5985f29ff0bcd06041b48f162bbefc" exitCode=0 Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.329898 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" event={"ID":"a8de247c-1cb7-448b-9071-0742bea10b51","Type":"ContainerDied","Data":"c2aa2133cc51efb73d16b353a60c8108bd5985f29ff0bcd06041b48f162bbefc"} Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.402178 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.556088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-nb\") pod \"a8de247c-1cb7-448b-9071-0742bea10b51\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.556165 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-sb\") pod \"a8de247c-1cb7-448b-9071-0742bea10b51\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.556239 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-config\") pod \"a8de247c-1cb7-448b-9071-0742bea10b51\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.556280 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gfzg4\" (UniqueName: \"kubernetes.io/projected/a8de247c-1cb7-448b-9071-0742bea10b51-kube-api-access-gfzg4\") pod \"a8de247c-1cb7-448b-9071-0742bea10b51\" (UID: 
\"a8de247c-1cb7-448b-9071-0742bea10b51\") " Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.556385 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-dns-svc\") pod \"a8de247c-1cb7-448b-9071-0742bea10b51\" (UID: \"a8de247c-1cb7-448b-9071-0742bea10b51\") " Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.574567 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8de247c-1cb7-448b-9071-0742bea10b51-kube-api-access-gfzg4" (OuterVolumeSpecName: "kube-api-access-gfzg4") pod "a8de247c-1cb7-448b-9071-0742bea10b51" (UID: "a8de247c-1cb7-448b-9071-0742bea10b51"). InnerVolumeSpecName "kube-api-access-gfzg4". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.609833 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-config" (OuterVolumeSpecName: "config") pod "a8de247c-1cb7-448b-9071-0742bea10b51" (UID: "a8de247c-1cb7-448b-9071-0742bea10b51"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.612783 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a8de247c-1cb7-448b-9071-0742bea10b51" (UID: "a8de247c-1cb7-448b-9071-0742bea10b51"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.623780 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a8de247c-1cb7-448b-9071-0742bea10b51" (UID: "a8de247c-1cb7-448b-9071-0742bea10b51"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.630105 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a8de247c-1cb7-448b-9071-0742bea10b51" (UID: "a8de247c-1cb7-448b-9071-0742bea10b51"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.658548 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gfzg4\" (UniqueName: \"kubernetes.io/projected/a8de247c-1cb7-448b-9071-0742bea10b51-kube-api-access-gfzg4\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.658595 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.658610 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.658620 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 
08:06:33 crc kubenswrapper[4831]: I1203 08:06:33.658631 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8de247c-1cb7-448b-9071-0742bea10b51-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:34 crc kubenswrapper[4831]: I1203 08:06:34.383118 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" Dec 03 08:06:34 crc kubenswrapper[4831]: I1203 08:06:34.383247 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-77595d75f7-z7pp2" event={"ID":"a8de247c-1cb7-448b-9071-0742bea10b51","Type":"ContainerDied","Data":"b0ddbea2f7c49c2a1ce4b9b17164b6df0697e86903c49dae5ccb2a92771da182"} Dec 03 08:06:34 crc kubenswrapper[4831]: I1203 08:06:34.383364 4831 scope.go:117] "RemoveContainer" containerID="c2aa2133cc51efb73d16b353a60c8108bd5985f29ff0bcd06041b48f162bbefc" Dec 03 08:06:34 crc kubenswrapper[4831]: I1203 08:06:34.407255 4831 scope.go:117] "RemoveContainer" containerID="f06c2836bde642573a46ebd630af0e45a7fc19dbe484c13278edd6f3cf95bb99" Dec 03 08:06:34 crc kubenswrapper[4831]: I1203 08:06:34.444231 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-77595d75f7-z7pp2"] Dec 03 08:06:34 crc kubenswrapper[4831]: I1203 08:06:34.470991 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-77595d75f7-z7pp2"] Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.026022 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8de247c-1cb7-448b-9071-0742bea10b51" path="/var/lib/kubelet/pods/a8de247c-1cb7-448b-9071-0742bea10b51/volumes" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.585478 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-xjhz6"] Dec 03 08:06:35 crc kubenswrapper[4831]: E1203 08:06:35.585981 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a8de247c-1cb7-448b-9071-0742bea10b51" containerName="init" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.586000 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8de247c-1cb7-448b-9071-0742bea10b51" containerName="init" Dec 03 08:06:35 crc kubenswrapper[4831]: E1203 08:06:35.586044 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8de247c-1cb7-448b-9071-0742bea10b51" containerName="dnsmasq-dns" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.586052 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8de247c-1cb7-448b-9071-0742bea10b51" containerName="dnsmasq-dns" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.586261 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8de247c-1cb7-448b-9071-0742bea10b51" containerName="dnsmasq-dns" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.587082 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.595882 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xjhz6"] Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.675329 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-b3e3-account-create-update-ktdk6"] Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.676413 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.678656 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.687174 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b3e3-account-create-update-ktdk6"] Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.703831 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-operator-scripts\") pod \"cinder-db-create-xjhz6\" (UID: \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\") " pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.703971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk9wb\" (UniqueName: \"kubernetes.io/projected/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-kube-api-access-vk9wb\") pod \"cinder-db-create-xjhz6\" (UID: \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\") " pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.805903 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kz7w\" (UniqueName: \"kubernetes.io/projected/a915b2c9-9e16-4841-b5cd-f572ba326520-kube-api-access-5kz7w\") pod \"cinder-b3e3-account-create-update-ktdk6\" (UID: \"a915b2c9-9e16-4841-b5cd-f572ba326520\") " pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.806005 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk9wb\" (UniqueName: \"kubernetes.io/projected/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-kube-api-access-vk9wb\") pod \"cinder-db-create-xjhz6\" (UID: 
\"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\") " pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.806128 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-operator-scripts\") pod \"cinder-db-create-xjhz6\" (UID: \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\") " pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.806167 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a915b2c9-9e16-4841-b5cd-f572ba326520-operator-scripts\") pod \"cinder-b3e3-account-create-update-ktdk6\" (UID: \"a915b2c9-9e16-4841-b5cd-f572ba326520\") " pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.807048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-operator-scripts\") pod \"cinder-db-create-xjhz6\" (UID: \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\") " pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.823443 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk9wb\" (UniqueName: \"kubernetes.io/projected/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-kube-api-access-vk9wb\") pod \"cinder-db-create-xjhz6\" (UID: \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\") " pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.908119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a915b2c9-9e16-4841-b5cd-f572ba326520-operator-scripts\") pod \"cinder-b3e3-account-create-update-ktdk6\" (UID: 
\"a915b2c9-9e16-4841-b5cd-f572ba326520\") " pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.908528 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5kz7w\" (UniqueName: \"kubernetes.io/projected/a915b2c9-9e16-4841-b5cd-f572ba326520-kube-api-access-5kz7w\") pod \"cinder-b3e3-account-create-update-ktdk6\" (UID: \"a915b2c9-9e16-4841-b5cd-f572ba326520\") " pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.909019 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a915b2c9-9e16-4841-b5cd-f572ba326520-operator-scripts\") pod \"cinder-b3e3-account-create-update-ktdk6\" (UID: \"a915b2c9-9e16-4841-b5cd-f572ba326520\") " pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.911954 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.929877 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kz7w\" (UniqueName: \"kubernetes.io/projected/a915b2c9-9e16-4841-b5cd-f572ba326520-kube-api-access-5kz7w\") pod \"cinder-b3e3-account-create-update-ktdk6\" (UID: \"a915b2c9-9e16-4841-b5cd-f572ba326520\") " pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:35 crc kubenswrapper[4831]: I1203 08:06:35.992425 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:36 crc kubenswrapper[4831]: I1203 08:06:36.498413 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-xjhz6"] Dec 03 08:06:36 crc kubenswrapper[4831]: I1203 08:06:36.547225 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-b3e3-account-create-update-ktdk6"] Dec 03 08:06:36 crc kubenswrapper[4831]: W1203 08:06:36.560050 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda915b2c9_9e16_4841_b5cd_f572ba326520.slice/crio-6fe229fa299db67290e602003989ab4702a1b87ec5b71b9748b8fbf35ebe7d32 WatchSource:0}: Error finding container 6fe229fa299db67290e602003989ab4702a1b87ec5b71b9748b8fbf35ebe7d32: Status 404 returned error can't find the container with id 6fe229fa299db67290e602003989ab4702a1b87ec5b71b9748b8fbf35ebe7d32 Dec 03 08:06:37 crc kubenswrapper[4831]: I1203 08:06:37.419986 4831 generic.go:334] "Generic (PLEG): container finished" podID="a915b2c9-9e16-4841-b5cd-f572ba326520" containerID="60f57e9f1925805d73f9e26237c75ccc38b17081cf89e1d75e623dcd98b9d30c" exitCode=0 Dec 03 08:06:37 crc kubenswrapper[4831]: I1203 08:06:37.420117 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b3e3-account-create-update-ktdk6" event={"ID":"a915b2c9-9e16-4841-b5cd-f572ba326520","Type":"ContainerDied","Data":"60f57e9f1925805d73f9e26237c75ccc38b17081cf89e1d75e623dcd98b9d30c"} Dec 03 08:06:37 crc kubenswrapper[4831]: I1203 08:06:37.422215 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b3e3-account-create-update-ktdk6" event={"ID":"a915b2c9-9e16-4841-b5cd-f572ba326520","Type":"ContainerStarted","Data":"6fe229fa299db67290e602003989ab4702a1b87ec5b71b9748b8fbf35ebe7d32"} Dec 03 08:06:37 crc kubenswrapper[4831]: I1203 08:06:37.426846 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="7b67cda2-032a-4d1d-aa32-8c79fb4828b4" containerID="73a3bc39da2ba251a65c32c919cb5e27ca0fd492c08cdecb2d19a87f429875c0" exitCode=0 Dec 03 08:06:37 crc kubenswrapper[4831]: I1203 08:06:37.426894 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xjhz6" event={"ID":"7b67cda2-032a-4d1d-aa32-8c79fb4828b4","Type":"ContainerDied","Data":"73a3bc39da2ba251a65c32c919cb5e27ca0fd492c08cdecb2d19a87f429875c0"} Dec 03 08:06:37 crc kubenswrapper[4831]: I1203 08:06:37.427188 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xjhz6" event={"ID":"7b67cda2-032a-4d1d-aa32-8c79fb4828b4","Type":"ContainerStarted","Data":"63a73d3edddd5191a833b1f6d99bb232551603b0a4562f98b3f198df91662aa6"} Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.866282 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.873811 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.966678 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a915b2c9-9e16-4841-b5cd-f572ba326520-operator-scripts\") pod \"a915b2c9-9e16-4841-b5cd-f572ba326520\" (UID: \"a915b2c9-9e16-4841-b5cd-f572ba326520\") " Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.966768 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk9wb\" (UniqueName: \"kubernetes.io/projected/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-kube-api-access-vk9wb\") pod \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\" (UID: \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\") " Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.966797 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5kz7w\" (UniqueName: \"kubernetes.io/projected/a915b2c9-9e16-4841-b5cd-f572ba326520-kube-api-access-5kz7w\") pod \"a915b2c9-9e16-4841-b5cd-f572ba326520\" (UID: \"a915b2c9-9e16-4841-b5cd-f572ba326520\") " Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.966851 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-operator-scripts\") pod \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\" (UID: \"7b67cda2-032a-4d1d-aa32-8c79fb4828b4\") " Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.967529 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a915b2c9-9e16-4841-b5cd-f572ba326520-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a915b2c9-9e16-4841-b5cd-f572ba326520" (UID: "a915b2c9-9e16-4841-b5cd-f572ba326520"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.967788 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b67cda2-032a-4d1d-aa32-8c79fb4828b4" (UID: "7b67cda2-032a-4d1d-aa32-8c79fb4828b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.972370 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a915b2c9-9e16-4841-b5cd-f572ba326520-kube-api-access-5kz7w" (OuterVolumeSpecName: "kube-api-access-5kz7w") pod "a915b2c9-9e16-4841-b5cd-f572ba326520" (UID: "a915b2c9-9e16-4841-b5cd-f572ba326520"). InnerVolumeSpecName "kube-api-access-5kz7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:38 crc kubenswrapper[4831]: I1203 08:06:38.972662 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-kube-api-access-vk9wb" (OuterVolumeSpecName: "kube-api-access-vk9wb") pod "7b67cda2-032a-4d1d-aa32-8c79fb4828b4" (UID: "7b67cda2-032a-4d1d-aa32-8c79fb4828b4"). InnerVolumeSpecName "kube-api-access-vk9wb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.070893 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a915b2c9-9e16-4841-b5cd-f572ba326520-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.070938 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk9wb\" (UniqueName: \"kubernetes.io/projected/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-kube-api-access-vk9wb\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.070951 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5kz7w\" (UniqueName: \"kubernetes.io/projected/a915b2c9-9e16-4841-b5cd-f572ba326520-kube-api-access-5kz7w\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.070960 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b67cda2-032a-4d1d-aa32-8c79fb4828b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.454045 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-xjhz6" Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.454940 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-xjhz6" event={"ID":"7b67cda2-032a-4d1d-aa32-8c79fb4828b4","Type":"ContainerDied","Data":"63a73d3edddd5191a833b1f6d99bb232551603b0a4562f98b3f198df91662aa6"} Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.454974 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="63a73d3edddd5191a833b1f6d99bb232551603b0a4562f98b3f198df91662aa6" Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.458119 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-b3e3-account-create-update-ktdk6" event={"ID":"a915b2c9-9e16-4841-b5cd-f572ba326520","Type":"ContainerDied","Data":"6fe229fa299db67290e602003989ab4702a1b87ec5b71b9748b8fbf35ebe7d32"} Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.458169 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6fe229fa299db67290e602003989ab4702a1b87ec5b71b9748b8fbf35ebe7d32" Dec 03 08:06:39 crc kubenswrapper[4831]: I1203 08:06:39.458243 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-b3e3-account-create-update-ktdk6" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.833570 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-2mdbb"] Dec 03 08:06:40 crc kubenswrapper[4831]: E1203 08:06:40.835468 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b67cda2-032a-4d1d-aa32-8c79fb4828b4" containerName="mariadb-database-create" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.835709 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b67cda2-032a-4d1d-aa32-8c79fb4828b4" containerName="mariadb-database-create" Dec 03 08:06:40 crc kubenswrapper[4831]: E1203 08:06:40.835732 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a915b2c9-9e16-4841-b5cd-f572ba326520" containerName="mariadb-account-create-update" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.835741 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a915b2c9-9e16-4841-b5cd-f572ba326520" containerName="mariadb-account-create-update" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.835912 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a915b2c9-9e16-4841-b5cd-f572ba326520" containerName="mariadb-account-create-update" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.835933 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b67cda2-032a-4d1d-aa32-8c79fb4828b4" containerName="mariadb-database-create" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.836526 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.838943 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.839184 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f2sxp" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.839808 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.857359 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2mdbb"] Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.912857 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7c5v\" (UniqueName: \"kubernetes.io/projected/87832c9d-bf1c-45fc-b106-bbcac2b8641c-kube-api-access-j7c5v\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.913145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-db-sync-config-data\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.913361 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-scripts\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.913534 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87832c9d-bf1c-45fc-b106-bbcac2b8641c-etc-machine-id\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.913660 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-config-data\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:40 crc kubenswrapper[4831]: I1203 08:06:40.913791 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-combined-ca-bundle\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.016498 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-scripts\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.016655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87832c9d-bf1c-45fc-b106-bbcac2b8641c-etc-machine-id\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.016691 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-config-data\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.016742 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-combined-ca-bundle\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.016783 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j7c5v\" (UniqueName: \"kubernetes.io/projected/87832c9d-bf1c-45fc-b106-bbcac2b8641c-kube-api-access-j7c5v\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.016822 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-db-sync-config-data\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.017353 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87832c9d-bf1c-45fc-b106-bbcac2b8641c-etc-machine-id\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.034371 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-db-sync-config-data\") pod \"cinder-db-sync-2mdbb\" (UID: 
\"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.034481 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-combined-ca-bundle\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.036734 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-scripts\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.037056 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-config-data\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.037718 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7c5v\" (UniqueName: \"kubernetes.io/projected/87832c9d-bf1c-45fc-b106-bbcac2b8641c-kube-api-access-j7c5v\") pod \"cinder-db-sync-2mdbb\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.167007 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.631156 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-2mdbb"] Dec 03 08:06:41 crc kubenswrapper[4831]: I1203 08:06:41.808310 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2mdbb" event={"ID":"87832c9d-bf1c-45fc-b106-bbcac2b8641c","Type":"ContainerStarted","Data":"5e45447af2350ac720a51536339046e2fc32a9f1427453d7d750ccfdb8b3f4bb"} Dec 03 08:06:42 crc kubenswrapper[4831]: I1203 08:06:42.818164 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2mdbb" event={"ID":"87832c9d-bf1c-45fc-b106-bbcac2b8641c","Type":"ContainerStarted","Data":"8abd76b0bd850cc8d760482a65b0688aa91ebaba0281bdc1736e721dd5d31911"} Dec 03 08:06:42 crc kubenswrapper[4831]: I1203 08:06:42.848991 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-2mdbb" podStartSLOduration=2.848970848 podStartE2EDuration="2.848970848s" podCreationTimestamp="2025-12-03 08:06:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:42.838558284 +0000 UTC m=+5740.182141802" watchObservedRunningTime="2025-12-03 08:06:42.848970848 +0000 UTC m=+5740.192554366" Dec 03 08:06:44 crc kubenswrapper[4831]: I1203 08:06:44.013220 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:06:44 crc kubenswrapper[4831]: E1203 08:06:44.014109 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:06:45 crc kubenswrapper[4831]: E1203 08:06:45.064118 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87832c9d_bf1c_45fc_b106_bbcac2b8641c.slice/crio-conmon-8abd76b0bd850cc8d760482a65b0688aa91ebaba0281bdc1736e721dd5d31911.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod87832c9d_bf1c_45fc_b106_bbcac2b8641c.slice/crio-8abd76b0bd850cc8d760482a65b0688aa91ebaba0281bdc1736e721dd5d31911.scope\": RecentStats: unable to find data in memory cache]" Dec 03 08:06:45 crc kubenswrapper[4831]: I1203 08:06:45.856782 4831 generic.go:334] "Generic (PLEG): container finished" podID="87832c9d-bf1c-45fc-b106-bbcac2b8641c" containerID="8abd76b0bd850cc8d760482a65b0688aa91ebaba0281bdc1736e721dd5d31911" exitCode=0 Dec 03 08:06:45 crc kubenswrapper[4831]: I1203 08:06:45.857097 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2mdbb" event={"ID":"87832c9d-bf1c-45fc-b106-bbcac2b8641c","Type":"ContainerDied","Data":"8abd76b0bd850cc8d760482a65b0688aa91ebaba0281bdc1736e721dd5d31911"} Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.381824 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.450018 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87832c9d-bf1c-45fc-b106-bbcac2b8641c-etc-machine-id\") pod \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.450098 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-db-sync-config-data\") pod \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.450158 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-combined-ca-bundle\") pod \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.450177 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/87832c9d-bf1c-45fc-b106-bbcac2b8641c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "87832c9d-bf1c-45fc-b106-bbcac2b8641c" (UID: "87832c9d-bf1c-45fc-b106-bbcac2b8641c"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.450208 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-scripts\") pod \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.450341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-config-data\") pod \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.450592 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7c5v\" (UniqueName: \"kubernetes.io/projected/87832c9d-bf1c-45fc-b106-bbcac2b8641c-kube-api-access-j7c5v\") pod \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\" (UID: \"87832c9d-bf1c-45fc-b106-bbcac2b8641c\") " Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.451444 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/87832c9d-bf1c-45fc-b106-bbcac2b8641c-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.456359 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "87832c9d-bf1c-45fc-b106-bbcac2b8641c" (UID: "87832c9d-bf1c-45fc-b106-bbcac2b8641c"). InnerVolumeSpecName "db-sync-config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.457957 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87832c9d-bf1c-45fc-b106-bbcac2b8641c-kube-api-access-j7c5v" (OuterVolumeSpecName: "kube-api-access-j7c5v") pod "87832c9d-bf1c-45fc-b106-bbcac2b8641c" (UID: "87832c9d-bf1c-45fc-b106-bbcac2b8641c"). InnerVolumeSpecName "kube-api-access-j7c5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.460487 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-scripts" (OuterVolumeSpecName: "scripts") pod "87832c9d-bf1c-45fc-b106-bbcac2b8641c" (UID: "87832c9d-bf1c-45fc-b106-bbcac2b8641c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.483058 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "87832c9d-bf1c-45fc-b106-bbcac2b8641c" (UID: "87832c9d-bf1c-45fc-b106-bbcac2b8641c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.518227 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-config-data" (OuterVolumeSpecName: "config-data") pod "87832c9d-bf1c-45fc-b106-bbcac2b8641c" (UID: "87832c9d-bf1c-45fc-b106-bbcac2b8641c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.554139 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j7c5v\" (UniqueName: \"kubernetes.io/projected/87832c9d-bf1c-45fc-b106-bbcac2b8641c-kube-api-access-j7c5v\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.554225 4831 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.554245 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.554263 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.554282 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/87832c9d-bf1c-45fc-b106-bbcac2b8641c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.880982 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-2mdbb" event={"ID":"87832c9d-bf1c-45fc-b106-bbcac2b8641c","Type":"ContainerDied","Data":"5e45447af2350ac720a51536339046e2fc32a9f1427453d7d750ccfdb8b3f4bb"} Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.881041 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e45447af2350ac720a51536339046e2fc32a9f1427453d7d750ccfdb8b3f4bb" Dec 03 08:06:47 crc kubenswrapper[4831]: I1203 08:06:47.881175 4831 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-2mdbb" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.121514 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bfb66ff99-jlg8h"] Dec 03 08:06:48 crc kubenswrapper[4831]: E1203 08:06:48.123494 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="87832c9d-bf1c-45fc-b106-bbcac2b8641c" containerName="cinder-db-sync" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.123518 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="87832c9d-bf1c-45fc-b106-bbcac2b8641c" containerName="cinder-db-sync" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.123705 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="87832c9d-bf1c-45fc-b106-bbcac2b8641c" containerName="cinder-db-sync" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.124683 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.151088 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bfb66ff99-jlg8h"] Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.273735 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-config\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.273807 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-dns-svc\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 
08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.273832 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.273903 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rczfx\" (UniqueName: \"kubernetes.io/projected/b217d44f-743f-48e6-a819-3483767f288b-kube-api-access-rczfx\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.273970 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.308470 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.322552 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.323554 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.326050 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.326534 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.326697 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.326854 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-f2sxp" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.375859 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.375909 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvtz6\" (UniqueName: \"kubernetes.io/projected/2255c247-84da-4f6b-9ea8-f0cc38ca7313-kube-api-access-rvtz6\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.375937 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " 
pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.376993 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2255c247-84da-4f6b-9ea8-f0cc38ca7313-logs\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.377029 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2255c247-84da-4f6b-9ea8-f0cc38ca7313-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.377065 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-config\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.377086 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data-custom\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.377151 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-dns-svc\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.377171 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.377196 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-scripts\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.377271 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfx\" (UniqueName: \"kubernetes.io/projected/b217d44f-743f-48e6-a819-3483767f288b-kube-api-access-rczfx\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.377373 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.378075 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-config\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.378112 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-dns-svc\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.378125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-nb\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.378303 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-sb\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.398506 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfx\" (UniqueName: \"kubernetes.io/projected/b217d44f-743f-48e6-a819-3483767f288b-kube-api-access-rczfx\") pod \"dnsmasq-dns-5bfb66ff99-jlg8h\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.446079 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.479491 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.479572 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2255c247-84da-4f6b-9ea8-f0cc38ca7313-logs\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.479607 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2255c247-84da-4f6b-9ea8-f0cc38ca7313-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.479649 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data-custom\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.479714 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-scripts\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.479739 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/2255c247-84da-4f6b-9ea8-f0cc38ca7313-etc-machine-id\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.479843 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.479875 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvtz6\" (UniqueName: \"kubernetes.io/projected/2255c247-84da-4f6b-9ea8-f0cc38ca7313-kube-api-access-rvtz6\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.480176 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2255c247-84da-4f6b-9ea8-f0cc38ca7313-logs\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.483483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.483702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data-custom\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.485473 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-scripts\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.488088 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.502981 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvtz6\" (UniqueName: \"kubernetes.io/projected/2255c247-84da-4f6b-9ea8-f0cc38ca7313-kube-api-access-rvtz6\") pod \"cinder-api-0\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.641785 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 08:06:48 crc kubenswrapper[4831]: I1203 08:06:48.916831 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bfb66ff99-jlg8h"] Dec 03 08:06:49 crc kubenswrapper[4831]: W1203 08:06:49.150455 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2255c247_84da_4f6b_9ea8_f0cc38ca7313.slice/crio-c53206489e8a93fa75232cc21a1c21d0bb1997feb3302fd05b9bdde9eb7224a3 WatchSource:0}: Error finding container c53206489e8a93fa75232cc21a1c21d0bb1997feb3302fd05b9bdde9eb7224a3: Status 404 returned error can't find the container with id c53206489e8a93fa75232cc21a1c21d0bb1997feb3302fd05b9bdde9eb7224a3 Dec 03 08:06:49 crc kubenswrapper[4831]: I1203 08:06:49.151967 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 08:06:49 crc kubenswrapper[4831]: I1203 08:06:49.906507 4831 generic.go:334] "Generic (PLEG): container finished" podID="b217d44f-743f-48e6-a819-3483767f288b" containerID="9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576" exitCode=0 Dec 03 08:06:49 crc kubenswrapper[4831]: I1203 08:06:49.906939 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" event={"ID":"b217d44f-743f-48e6-a819-3483767f288b","Type":"ContainerDied","Data":"9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576"} Dec 03 08:06:49 crc kubenswrapper[4831]: I1203 08:06:49.906966 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" event={"ID":"b217d44f-743f-48e6-a819-3483767f288b","Type":"ContainerStarted","Data":"6d50b2fa85bf36a58debc2582bd07de3f0dbbb5ef108fdbb17e753fd0c76cdc5"} Dec 03 08:06:49 crc kubenswrapper[4831]: I1203 08:06:49.913118 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" 
event={"ID":"2255c247-84da-4f6b-9ea8-f0cc38ca7313","Type":"ContainerStarted","Data":"99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c"} Dec 03 08:06:49 crc kubenswrapper[4831]: I1203 08:06:49.913159 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2255c247-84da-4f6b-9ea8-f0cc38ca7313","Type":"ContainerStarted","Data":"c53206489e8a93fa75232cc21a1c21d0bb1997feb3302fd05b9bdde9eb7224a3"} Dec 03 08:06:50 crc kubenswrapper[4831]: I1203 08:06:50.925928 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" event={"ID":"b217d44f-743f-48e6-a819-3483767f288b","Type":"ContainerStarted","Data":"96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878"} Dec 03 08:06:50 crc kubenswrapper[4831]: I1203 08:06:50.926370 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:50 crc kubenswrapper[4831]: I1203 08:06:50.929156 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2255c247-84da-4f6b-9ea8-f0cc38ca7313","Type":"ContainerStarted","Data":"7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40"} Dec 03 08:06:50 crc kubenswrapper[4831]: I1203 08:06:50.929396 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 08:06:50 crc kubenswrapper[4831]: I1203 08:06:50.957176 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:06:50 crc kubenswrapper[4831]: I1203 08:06:50.957414 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="455748d2-1e53-4029-a091-77d21b68e219" containerName="nova-scheduler-scheduler" containerID="cri-o://2f243a0192fe0f0f7d13fb18f6ac6c723d3c8a64197b85f337ad42a2446030ea" gracePeriod=30 Dec 03 08:06:50 crc kubenswrapper[4831]: I1203 08:06:50.974342 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" podStartSLOduration=2.974297659 podStartE2EDuration="2.974297659s" podCreationTimestamp="2025-12-03 08:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:50.942885911 +0000 UTC m=+5748.286469439" watchObservedRunningTime="2025-12-03 08:06:50.974297659 +0000 UTC m=+5748.317881167" Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.011468 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.011537 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.011555 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.011781 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-log" containerID="cri-o://e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290" gracePeriod=30 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.011961 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-metadata" containerID="cri-o://e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56" gracePeriod=30 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.012090 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-log" containerID="cri-o://deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09" 
gracePeriod=30 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.012295 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-api" containerID="cri-o://21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd" gracePeriod=30 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.012359 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="1b192790-67c2-4112-bd22-9b0abbfb9394" containerName="nova-cell0-conductor-conductor" containerID="cri-o://7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396" gracePeriod=30 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.031642 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.031620112 podStartE2EDuration="3.031620112s" podCreationTimestamp="2025-12-03 08:06:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:51.022718145 +0000 UTC m=+5748.366301643" watchObservedRunningTime="2025-12-03 08:06:51.031620112 +0000 UTC m=+5748.375203620" Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.048667 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.048884 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="527835db-0f53-4146-87b2-e382437f5014" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4" gracePeriod=30 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.875508 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.939483 4831 generic.go:334] "Generic (PLEG): container finished" podID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerID="deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09" exitCode=143 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.939557 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f","Type":"ContainerDied","Data":"deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09"} Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.944928 4831 generic.go:334] "Generic (PLEG): container finished" podID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerID="e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290" exitCode=143 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.944988 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74091ef-35c2-465b-a80d-5df2772e3f9d","Type":"ContainerDied","Data":"e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290"} Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.946677 4831 generic.go:334] "Generic (PLEG): container finished" podID="527835db-0f53-4146-87b2-e382437f5014" containerID="31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4" exitCode=0 Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.946711 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"527835db-0f53-4146-87b2-e382437f5014","Type":"ContainerDied","Data":"31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4"} Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.946760 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" 
event={"ID":"527835db-0f53-4146-87b2-e382437f5014","Type":"ContainerDied","Data":"67b051f83eb139b5ee2cc97e66249af597b23c0a9fadba4252fb9afe9f188264"} Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.946727 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.946780 4831 scope.go:117] "RemoveContainer" containerID="31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4" Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.971606 4831 scope.go:117] "RemoveContainer" containerID="31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4" Dec 03 08:06:51 crc kubenswrapper[4831]: E1203 08:06:51.972097 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4\": container with ID starting with 31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4 not found: ID does not exist" containerID="31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4" Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.972154 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4"} err="failed to get container status \"31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4\": rpc error: code = NotFound desc = could not find container \"31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4\": container with ID starting with 31c8fd4d995f9dd8deec526786a8bc03999bb1a71defd07122d9a19b78801df4 not found: ID does not exist" Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.972827 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkspf\" (UniqueName: 
\"kubernetes.io/projected/527835db-0f53-4146-87b2-e382437f5014-kube-api-access-nkspf\") pod \"527835db-0f53-4146-87b2-e382437f5014\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.972945 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-config-data\") pod \"527835db-0f53-4146-87b2-e382437f5014\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.973567 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-combined-ca-bundle\") pod \"527835db-0f53-4146-87b2-e382437f5014\" (UID: \"527835db-0f53-4146-87b2-e382437f5014\") " Dec 03 08:06:51 crc kubenswrapper[4831]: I1203 08:06:51.978368 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/527835db-0f53-4146-87b2-e382437f5014-kube-api-access-nkspf" (OuterVolumeSpecName: "kube-api-access-nkspf") pod "527835db-0f53-4146-87b2-e382437f5014" (UID: "527835db-0f53-4146-87b2-e382437f5014"). InnerVolumeSpecName "kube-api-access-nkspf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.003270 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-config-data" (OuterVolumeSpecName: "config-data") pod "527835db-0f53-4146-87b2-e382437f5014" (UID: "527835db-0f53-4146-87b2-e382437f5014"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.011797 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "527835db-0f53-4146-87b2-e382437f5014" (UID: "527835db-0f53-4146-87b2-e382437f5014"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.076368 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.076982 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkspf\" (UniqueName: \"kubernetes.io/projected/527835db-0f53-4146-87b2-e382437f5014-kube-api-access-nkspf\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.077088 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/527835db-0f53-4146-87b2-e382437f5014-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.281271 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.296734 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.316550 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.317409 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="527835db-0f53-4146-87b2-e382437f5014" 
containerName="nova-cell1-novncproxy-novncproxy" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.317507 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="527835db-0f53-4146-87b2-e382437f5014" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.317779 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="527835db-0f53-4146-87b2-e382437f5014" containerName="nova-cell1-novncproxy-novncproxy" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.318607 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.321671 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.329586 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f243a0192fe0f0f7d13fb18f6ac6c723d3c8a64197b85f337ad42a2446030ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.331079 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f243a0192fe0f0f7d13fb18f6ac6c723d3c8a64197b85f337ad42a2446030ea" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.333908 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="2f243a0192fe0f0f7d13fb18f6ac6c723d3c8a64197b85f337ad42a2446030ea" 
cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.334028 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="455748d2-1e53-4029-a091-77d21b68e219" containerName="nova-scheduler-scheduler" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.335180 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.383800 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c49f52-ed33-4f4b-a70e-6fb63117f774-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.384084 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42n2n\" (UniqueName: \"kubernetes.io/projected/45c49f52-ed33-4f4b-a70e-6fb63117f774-kube-api-access-42n2n\") pod \"nova-cell1-novncproxy-0\" (UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.384824 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c49f52-ed33-4f4b-a70e-6fb63117f774-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.486437 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/45c49f52-ed33-4f4b-a70e-6fb63117f774-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.486485 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42n2n\" (UniqueName: \"kubernetes.io/projected/45c49f52-ed33-4f4b-a70e-6fb63117f774-kube-api-access-42n2n\") pod \"nova-cell1-novncproxy-0\" (UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.486588 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c49f52-ed33-4f4b-a70e-6fb63117f774-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.491224 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/45c49f52-ed33-4f4b-a70e-6fb63117f774-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.496140 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/45c49f52-ed33-4f4b-a70e-6fb63117f774-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.503036 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42n2n\" (UniqueName: \"kubernetes.io/projected/45c49f52-ed33-4f4b-a70e-6fb63117f774-kube-api-access-42n2n\") pod \"nova-cell1-novncproxy-0\" 
(UID: \"45c49f52-ed33-4f4b-a70e-6fb63117f774\") " pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: I1203 08:06:52.640198 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.694230 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.696105 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.698007 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:52 crc kubenswrapper[4831]: E1203 08:06:52.698068 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell0-conductor-0" podUID="1b192790-67c2-4112-bd22-9b0abbfb9394" containerName="nova-cell0-conductor-conductor" Dec 03 08:06:53 crc kubenswrapper[4831]: I1203 08:06:53.035854 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="527835db-0f53-4146-87b2-e382437f5014" path="/var/lib/kubelet/pods/527835db-0f53-4146-87b2-e382437f5014/volumes" Dec 03 08:06:53 crc kubenswrapper[4831]: I1203 08:06:53.149238 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Dec 03 08:06:53 crc kubenswrapper[4831]: I1203 08:06:53.987073 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"45c49f52-ed33-4f4b-a70e-6fb63117f774","Type":"ContainerStarted","Data":"c05c7c6392ada02fb395871dbd56cdfb2991c1875c3c683de3e31cc0128ac3df"} Dec 03 08:06:53 crc kubenswrapper[4831]: I1203 08:06:53.987564 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"45c49f52-ed33-4f4b-a70e-6fb63117f774","Type":"ContainerStarted","Data":"3b65252f84c3fd7af2994ad90747a8d20a2956a7be654ac51f201c1fe3cdb1a6"} Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.169477 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.80:8775/\": read tcp 10.217.0.2:43308->10.217.1.80:8775: read: connection reset by peer" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.170340 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.80:8775/\": read tcp 10.217.0.2:43318->10.217.1.80:8775: read: connection reset by peer" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.283057 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.283035816 podStartE2EDuration="2.283035816s" podCreationTimestamp="2025-12-03 08:06:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:54.022898601 +0000 UTC m=+5751.366482219" watchObservedRunningTime="2025-12-03 08:06:54.283035816 +0000 UTC m=+5751.626619334" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.293504 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.293754 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="a6c85412-de26-4d5f-91c0-fef3c84a682a" containerName="nova-cell1-conductor-conductor" containerID="cri-o://1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59" gracePeriod=30 Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.661140 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.730868 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-config-data\") pod \"f74091ef-35c2-465b-a80d-5df2772e3f9d\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.730966 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-combined-ca-bundle\") pod \"f74091ef-35c2-465b-a80d-5df2772e3f9d\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.731005 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74091ef-35c2-465b-a80d-5df2772e3f9d-logs\") pod \"f74091ef-35c2-465b-a80d-5df2772e3f9d\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " Dec 03 08:06:54 crc 
kubenswrapper[4831]: I1203 08:06:54.731039 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28qcg\" (UniqueName: \"kubernetes.io/projected/f74091ef-35c2-465b-a80d-5df2772e3f9d-kube-api-access-28qcg\") pod \"f74091ef-35c2-465b-a80d-5df2772e3f9d\" (UID: \"f74091ef-35c2-465b-a80d-5df2772e3f9d\") " Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.733229 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f74091ef-35c2-465b-a80d-5df2772e3f9d-logs" (OuterVolumeSpecName: "logs") pod "f74091ef-35c2-465b-a80d-5df2772e3f9d" (UID: "f74091ef-35c2-465b-a80d-5df2772e3f9d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.736621 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f74091ef-35c2-465b-a80d-5df2772e3f9d-kube-api-access-28qcg" (OuterVolumeSpecName: "kube-api-access-28qcg") pod "f74091ef-35c2-465b-a80d-5df2772e3f9d" (UID: "f74091ef-35c2-465b-a80d-5df2772e3f9d"). InnerVolumeSpecName "kube-api-access-28qcg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.758471 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f74091ef-35c2-465b-a80d-5df2772e3f9d" (UID: "f74091ef-35c2-465b-a80d-5df2772e3f9d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.768339 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-config-data" (OuterVolumeSpecName: "config-data") pod "f74091ef-35c2-465b-a80d-5df2772e3f9d" (UID: "f74091ef-35c2-465b-a80d-5df2772e3f9d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.832470 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.832503 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f74091ef-35c2-465b-a80d-5df2772e3f9d-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.832513 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-28qcg\" (UniqueName: \"kubernetes.io/projected/f74091ef-35c2-465b-a80d-5df2772e3f9d-kube-api-access-28qcg\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.832524 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f74091ef-35c2-465b-a80d-5df2772e3f9d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.840304 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.934006 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qbqx\" (UniqueName: \"kubernetes.io/projected/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-kube-api-access-4qbqx\") pod \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.934121 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-config-data\") pod \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.934243 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-combined-ca-bundle\") pod \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.934262 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-logs\") pod \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\" (UID: \"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f\") " Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.935467 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-logs" (OuterVolumeSpecName: "logs") pod "e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" (UID: "e2d6f10f-5fa7-4f7d-b814-332a9a8db55f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.939972 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-kube-api-access-4qbqx" (OuterVolumeSpecName: "kube-api-access-4qbqx") pod "e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" (UID: "e2d6f10f-5fa7-4f7d-b814-332a9a8db55f"). InnerVolumeSpecName "kube-api-access-4qbqx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.959432 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" (UID: "e2d6f10f-5fa7-4f7d-b814-332a9a8db55f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.974603 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-config-data" (OuterVolumeSpecName: "config-data") pod "e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" (UID: "e2d6f10f-5fa7-4f7d-b814-332a9a8db55f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.995709 4831 generic.go:334] "Generic (PLEG): container finished" podID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerID="e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56" exitCode=0 Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.995803 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.996713 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74091ef-35c2-465b-a80d-5df2772e3f9d","Type":"ContainerDied","Data":"e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56"} Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.996786 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"f74091ef-35c2-465b-a80d-5df2772e3f9d","Type":"ContainerDied","Data":"ae076ba64f9c9872f6dc0b44e8597237cfea8a98c0f4e975dc12b52019afb791"} Dec 03 08:06:54 crc kubenswrapper[4831]: I1203 08:06:54.996836 4831 scope.go:117] "RemoveContainer" containerID="e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.000454 4831 generic.go:334] "Generic (PLEG): container finished" podID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerID="21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd" exitCode=0 Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.000575 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.000579 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f","Type":"ContainerDied","Data":"21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd"} Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.001612 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e2d6f10f-5fa7-4f7d-b814-332a9a8db55f","Type":"ContainerDied","Data":"7b1c0a07191f6de5e792d7ca4d4d19a33ef9910708e2fe9e9fb66d8be6b16007"} Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.012919 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:06:55 crc kubenswrapper[4831]: E1203 08:06:55.013113 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.019064 4831 scope.go:117] "RemoveContainer" containerID="e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.037783 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.038370 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-logs\") on node \"crc\" 
DevicePath \"\"" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.038395 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qbqx\" (UniqueName: \"kubernetes.io/projected/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-kube-api-access-4qbqx\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.038407 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.042754 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.051940 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.064557 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.071716 4831 scope.go:117] "RemoveContainer" containerID="e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56" Dec 03 08:06:55 crc kubenswrapper[4831]: E1203 08:06:55.072531 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56\": container with ID starting with e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56 not found: ID does not exist" containerID="e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.072556 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56"} err="failed to get container status \"e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56\": rpc error: 
code = NotFound desc = could not find container \"e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56\": container with ID starting with e2faead6d37526200dafdf30eb76d97cc94f6be90b39944c47754b5e9c2b7c56 not found: ID does not exist" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.072575 4831 scope.go:117] "RemoveContainer" containerID="e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290" Dec 03 08:06:55 crc kubenswrapper[4831]: E1203 08:06:55.073056 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290\": container with ID starting with e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290 not found: ID does not exist" containerID="e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.073077 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290"} err="failed to get container status \"e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290\": rpc error: code = NotFound desc = could not find container \"e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290\": container with ID starting with e3f9c75998dec42b0e4284b561eb4bc419a5e51ee9e81070bbd4e1b5f90d2290 not found: ID does not exist" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.073119 4831 scope.go:117] "RemoveContainer" containerID="21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.086281 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.105679 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: 
E1203 08:06:55.106266 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-api" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.106281 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-api" Dec 03 08:06:55 crc kubenswrapper[4831]: E1203 08:06:55.106297 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-log" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.106306 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-log" Dec 03 08:06:55 crc kubenswrapper[4831]: E1203 08:06:55.106332 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-log" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.106338 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-log" Dec 03 08:06:55 crc kubenswrapper[4831]: E1203 08:06:55.106361 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-metadata" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.106367 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-metadata" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.106548 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-metadata" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.106564 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-log" Dec 03 08:06:55 crc 
kubenswrapper[4831]: I1203 08:06:55.106573 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" containerName="nova-metadata-log" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.106585 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" containerName="nova-api-api" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.107718 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.113001 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.118567 4831 scope.go:117] "RemoveContainer" containerID="deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.120855 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.132545 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.134155 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.137708 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.142576 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.143136 4831 scope.go:117] "RemoveContainer" containerID="21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd" Dec 03 08:06:55 crc kubenswrapper[4831]: E1203 08:06:55.143819 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd\": container with ID starting with 21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd not found: ID does not exist" containerID="21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.143945 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd"} err="failed to get container status \"21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd\": rpc error: code = NotFound desc = could not find container \"21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd\": container with ID starting with 21f3d16ad710a41a6d61da46b78e9f46465c6b65f3e70ef7c051c24dbc27b7fd not found: ID does not exist" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.144053 4831 scope.go:117] "RemoveContainer" containerID="deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09" Dec 03 08:06:55 crc kubenswrapper[4831]: E1203 08:06:55.146620 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09\": container with ID starting with deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09 not found: ID does not exist" containerID="deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.146729 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09"} err="failed to get container status \"deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09\": rpc error: code = NotFound desc = could not find container \"deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09\": container with ID starting with deeb7027104d9edf24993d23a81a71ed05d4a1ef9086743bdab4fe980caf1f09 not found: ID does not exist" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.242015 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22841d09-0c8c-49b0-a674-8bc092431ad9-logs\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.243028 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-config-data\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.243363 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 
08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.243440 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw2g6\" (UniqueName: \"kubernetes.io/projected/22841d09-0c8c-49b0-a674-8bc092431ad9-kube-api-access-pw2g6\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.243543 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5ndt\" (UniqueName: \"kubernetes.io/projected/e306eb84-97a5-44b8-9675-018da9b131a2-kube-api-access-g5ndt\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.243620 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.243776 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e306eb84-97a5-44b8-9675-018da9b131a2-logs\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.243863 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-config-data\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.347537 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e306eb84-97a5-44b8-9675-018da9b131a2-logs\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.347924 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-config-data\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.348143 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22841d09-0c8c-49b0-a674-8bc092431ad9-logs\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.348327 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-config-data\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.348348 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e306eb84-97a5-44b8-9675-018da9b131a2-logs\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.348668 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22841d09-0c8c-49b0-a674-8bc092431ad9-logs\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.348873 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.349028 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pw2g6\" (UniqueName: \"kubernetes.io/projected/22841d09-0c8c-49b0-a674-8bc092431ad9-kube-api-access-pw2g6\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.349252 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5ndt\" (UniqueName: \"kubernetes.io/projected/e306eb84-97a5-44b8-9675-018da9b131a2-kube-api-access-g5ndt\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.349746 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.353608 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-config-data\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.354599 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.357221 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-config-data\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.367368 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.370108 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw2g6\" (UniqueName: \"kubernetes.io/projected/22841d09-0c8c-49b0-a674-8bc092431ad9-kube-api-access-pw2g6\") pod \"nova-metadata-0\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.373432 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5ndt\" (UniqueName: \"kubernetes.io/projected/e306eb84-97a5-44b8-9675-018da9b131a2-kube-api-access-g5ndt\") pod \"nova-api-0\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.439397 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.451730 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.950726 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:06:55 crc kubenswrapper[4831]: I1203 08:06:55.958392 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.020663 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22841d09-0c8c-49b0-a674-8bc092431ad9","Type":"ContainerStarted","Data":"0f4ac62a9a26eeab2a44f876d31982e589bc149781159d72c65c09be1539b92f"} Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.022709 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e306eb84-97a5-44b8-9675-018da9b131a2","Type":"ContainerStarted","Data":"c46966245dc37b5e552e6963f251fa84110f2946eda6c2bfa1a06399e75a26f1"} Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.029709 4831 generic.go:334] "Generic (PLEG): container finished" podID="455748d2-1e53-4029-a091-77d21b68e219" containerID="2f243a0192fe0f0f7d13fb18f6ac6c723d3c8a64197b85f337ad42a2446030ea" exitCode=0 Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.029738 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"455748d2-1e53-4029-a091-77d21b68e219","Type":"ContainerDied","Data":"2f243a0192fe0f0f7d13fb18f6ac6c723d3c8a64197b85f337ad42a2446030ea"} Dec 03 08:06:56 crc kubenswrapper[4831]: E1203 08:06:56.126741 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:56 crc kubenswrapper[4831]: E1203 08:06:56.128374 4831 log.go:32] "ExecSync cmd from 
runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:56 crc kubenswrapper[4831]: E1203 08:06:56.130240 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:56 crc kubenswrapper[4831]: E1203 08:06:56.130337 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="a6c85412-de26-4d5f-91c0-fef3c84a682a" containerName="nova-cell1-conductor-conductor" Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.191664 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.277919 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-config-data\") pod \"455748d2-1e53-4029-a091-77d21b68e219\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.277968 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xg7nw\" (UniqueName: \"kubernetes.io/projected/455748d2-1e53-4029-a091-77d21b68e219-kube-api-access-xg7nw\") pod \"455748d2-1e53-4029-a091-77d21b68e219\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.278045 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-combined-ca-bundle\") pod \"455748d2-1e53-4029-a091-77d21b68e219\" (UID: \"455748d2-1e53-4029-a091-77d21b68e219\") " Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.282927 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/455748d2-1e53-4029-a091-77d21b68e219-kube-api-access-xg7nw" (OuterVolumeSpecName: "kube-api-access-xg7nw") pod "455748d2-1e53-4029-a091-77d21b68e219" (UID: "455748d2-1e53-4029-a091-77d21b68e219"). InnerVolumeSpecName "kube-api-access-xg7nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.323551 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-config-data" (OuterVolumeSpecName: "config-data") pod "455748d2-1e53-4029-a091-77d21b68e219" (UID: "455748d2-1e53-4029-a091-77d21b68e219"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.326602 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "455748d2-1e53-4029-a091-77d21b68e219" (UID: "455748d2-1e53-4029-a091-77d21b68e219"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.379798 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.379837 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xg7nw\" (UniqueName: \"kubernetes.io/projected/455748d2-1e53-4029-a091-77d21b68e219-kube-api-access-xg7nw\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:56 crc kubenswrapper[4831]: I1203 08:06:56.379847 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/455748d2-1e53-4029-a091-77d21b68e219-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.038085 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2d6f10f-5fa7-4f7d-b814-332a9a8db55f" path="/var/lib/kubelet/pods/e2d6f10f-5fa7-4f7d-b814-332a9a8db55f/volumes" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.039831 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f74091ef-35c2-465b-a80d-5df2772e3f9d" path="/var/lib/kubelet/pods/f74091ef-35c2-465b-a80d-5df2772e3f9d/volumes" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.045334 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"22841d09-0c8c-49b0-a674-8bc092431ad9","Type":"ContainerStarted","Data":"8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794"} Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.045411 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22841d09-0c8c-49b0-a674-8bc092431ad9","Type":"ContainerStarted","Data":"3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283"} Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.047875 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e306eb84-97a5-44b8-9675-018da9b131a2","Type":"ContainerStarted","Data":"4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598"} Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.047920 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e306eb84-97a5-44b8-9675-018da9b131a2","Type":"ContainerStarted","Data":"1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb"} Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.050280 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"455748d2-1e53-4029-a091-77d21b68e219","Type":"ContainerDied","Data":"025c93d122dc151904c60c439c62bf3afff6b59e3ad44f0781ec14e7e8c6dadf"} Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.050340 4831 scope.go:117] "RemoveContainer" containerID="2f243a0192fe0f0f7d13fb18f6ac6c723d3c8a64197b85f337ad42a2446030ea" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.050392 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.075411 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.075373573 podStartE2EDuration="2.075373573s" podCreationTimestamp="2025-12-03 08:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:57.062179803 +0000 UTC m=+5754.405763311" watchObservedRunningTime="2025-12-03 08:06:57.075373573 +0000 UTC m=+5754.418957121" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.103716 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.103693285 podStartE2EDuration="2.103693285s" podCreationTimestamp="2025-12-03 08:06:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:57.083785635 +0000 UTC m=+5754.427369173" watchObservedRunningTime="2025-12-03 08:06:57.103693285 +0000 UTC m=+5754.447276793" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.119135 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.131076 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.143220 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:06:57 crc kubenswrapper[4831]: E1203 08:06:57.143701 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="455748d2-1e53-4029-a091-77d21b68e219" containerName="nova-scheduler-scheduler" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.143718 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="455748d2-1e53-4029-a091-77d21b68e219" 
containerName="nova-scheduler-scheduler" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.143955 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="455748d2-1e53-4029-a091-77d21b68e219" containerName="nova-scheduler-scheduler" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.144860 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.153943 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.183633 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.309251 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.310097 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-config-data\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.310335 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg4xz\" (UniqueName: \"kubernetes.io/projected/4856725f-47d3-4087-873b-89e7d53b6b0d-kube-api-access-rg4xz\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.412241 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rg4xz\" (UniqueName: \"kubernetes.io/projected/4856725f-47d3-4087-873b-89e7d53b6b0d-kube-api-access-rg4xz\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.412720 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.412888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-config-data\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.422546 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.422814 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-config-data\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.437135 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg4xz\" (UniqueName: 
\"kubernetes.io/projected/4856725f-47d3-4087-873b-89e7d53b6b0d-kube-api-access-rg4xz\") pod \"nova-scheduler-0\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.495482 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.641398 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Dec 03 08:06:57 crc kubenswrapper[4831]: E1203 08:06:57.698695 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:57 crc kubenswrapper[4831]: E1203 08:06:57.702652 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:57 crc kubenswrapper[4831]: E1203 08:06:57.704409 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:06:57 crc kubenswrapper[4831]: E1203 08:06:57.704444 4831 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" 
pod="openstack/nova-cell0-conductor-0" podUID="1b192790-67c2-4112-bd22-9b0abbfb9394" containerName="nova-cell0-conductor-conductor" Dec 03 08:06:57 crc kubenswrapper[4831]: I1203 08:06:57.973218 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:06:57 crc kubenswrapper[4831]: W1203 08:06:57.983603 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4856725f_47d3_4087_873b_89e7d53b6b0d.slice/crio-9874ccca751e2b9abb7ac6b908f83e0af901a62cc566bf340ec4cfbcd623f2a0 WatchSource:0}: Error finding container 9874ccca751e2b9abb7ac6b908f83e0af901a62cc566bf340ec4cfbcd623f2a0: Status 404 returned error can't find the container with id 9874ccca751e2b9abb7ac6b908f83e0af901a62cc566bf340ec4cfbcd623f2a0 Dec 03 08:06:58 crc kubenswrapper[4831]: I1203 08:06:58.063688 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4856725f-47d3-4087-873b-89e7d53b6b0d","Type":"ContainerStarted","Data":"9874ccca751e2b9abb7ac6b908f83e0af901a62cc566bf340ec4cfbcd623f2a0"} Dec 03 08:06:58 crc kubenswrapper[4831]: I1203 08:06:58.447558 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:06:58 crc kubenswrapper[4831]: I1203 08:06:58.562642 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fc9f4dc7-bbl65"] Dec 03 08:06:58 crc kubenswrapper[4831]: I1203 08:06:58.562934 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" podUID="a04ea935-caf4-46ad-8187-bd86753b3692" containerName="dnsmasq-dns" containerID="cri-o://0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5" gracePeriod=10 Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.021894 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="455748d2-1e53-4029-a091-77d21b68e219" path="/var/lib/kubelet/pods/455748d2-1e53-4029-a091-77d21b68e219/volumes" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.035088 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.097771 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4856725f-47d3-4087-873b-89e7d53b6b0d","Type":"ContainerStarted","Data":"b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383"} Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.106759 4831 generic.go:334] "Generic (PLEG): container finished" podID="a04ea935-caf4-46ad-8187-bd86753b3692" containerID="0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5" exitCode=0 Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.106806 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" event={"ID":"a04ea935-caf4-46ad-8187-bd86753b3692","Type":"ContainerDied","Data":"0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5"} Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.106829 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" event={"ID":"a04ea935-caf4-46ad-8187-bd86753b3692","Type":"ContainerDied","Data":"ce6b09b87c96c05f4b0f1b3a2f819ee9c261018e109d0a0b6840c3287c964d26"} Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.106847 4831 scope.go:117] "RemoveContainer" containerID="0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.106961 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-9fc9f4dc7-bbl65" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.117185 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.117169064 podStartE2EDuration="2.117169064s" podCreationTimestamp="2025-12-03 08:06:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:06:59.114637496 +0000 UTC m=+5756.458221004" watchObservedRunningTime="2025-12-03 08:06:59.117169064 +0000 UTC m=+5756.460752572" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.133539 4831 scope.go:117] "RemoveContainer" containerID="b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.152503 4831 scope.go:117] "RemoveContainer" containerID="0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5" Dec 03 08:06:59 crc kubenswrapper[4831]: E1203 08:06:59.153969 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5\": container with ID starting with 0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5 not found: ID does not exist" containerID="0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.154016 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5"} err="failed to get container status \"0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5\": rpc error: code = NotFound desc = could not find container \"0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5\": container with ID starting with 
0d9031b2be3a91f0a36174f84f9d85279759c4da6dc4056674d6f17b3dd77de5 not found: ID does not exist" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.154042 4831 scope.go:117] "RemoveContainer" containerID="b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936" Dec 03 08:06:59 crc kubenswrapper[4831]: E1203 08:06:59.154342 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936\": container with ID starting with b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936 not found: ID does not exist" containerID="b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.154387 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936"} err="failed to get container status \"b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936\": rpc error: code = NotFound desc = could not find container \"b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936\": container with ID starting with b86ded81795837938b8215fd2c69fc070135bb0f0dfeb116852ab90277f39936 not found: ID does not exist" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.166181 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-sb\") pod \"a04ea935-caf4-46ad-8187-bd86753b3692\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.166279 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-nb\") pod \"a04ea935-caf4-46ad-8187-bd86753b3692\" (UID: 
\"a04ea935-caf4-46ad-8187-bd86753b3692\") " Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.166423 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-config\") pod \"a04ea935-caf4-46ad-8187-bd86753b3692\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.166578 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-dns-svc\") pod \"a04ea935-caf4-46ad-8187-bd86753b3692\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.166651 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th2rv\" (UniqueName: \"kubernetes.io/projected/a04ea935-caf4-46ad-8187-bd86753b3692-kube-api-access-th2rv\") pod \"a04ea935-caf4-46ad-8187-bd86753b3692\" (UID: \"a04ea935-caf4-46ad-8187-bd86753b3692\") " Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.177882 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a04ea935-caf4-46ad-8187-bd86753b3692-kube-api-access-th2rv" (OuterVolumeSpecName: "kube-api-access-th2rv") pod "a04ea935-caf4-46ad-8187-bd86753b3692" (UID: "a04ea935-caf4-46ad-8187-bd86753b3692"). InnerVolumeSpecName "kube-api-access-th2rv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.223078 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a04ea935-caf4-46ad-8187-bd86753b3692" (UID: "a04ea935-caf4-46ad-8187-bd86753b3692"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.224407 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-config" (OuterVolumeSpecName: "config") pod "a04ea935-caf4-46ad-8187-bd86753b3692" (UID: "a04ea935-caf4-46ad-8187-bd86753b3692"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.230780 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a04ea935-caf4-46ad-8187-bd86753b3692" (UID: "a04ea935-caf4-46ad-8187-bd86753b3692"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.232894 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a04ea935-caf4-46ad-8187-bd86753b3692" (UID: "a04ea935-caf4-46ad-8187-bd86753b3692"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.269139 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th2rv\" (UniqueName: \"kubernetes.io/projected/a04ea935-caf4-46ad-8187-bd86753b3692-kube-api-access-th2rv\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.269390 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.269400 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.269409 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.269421 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a04ea935-caf4-46ad-8187-bd86753b3692-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.439685 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-9fc9f4dc7-bbl65"] Dec 03 08:06:59 crc kubenswrapper[4831]: I1203 08:06:59.446498 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-9fc9f4dc7-bbl65"] Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.071954 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.121211 4831 generic.go:334] "Generic (PLEG): container finished" podID="a6c85412-de26-4d5f-91c0-fef3c84a682a" containerID="1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59" exitCode=0 Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.121291 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a6c85412-de26-4d5f-91c0-fef3c84a682a","Type":"ContainerDied","Data":"1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59"} Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.121334 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"a6c85412-de26-4d5f-91c0-fef3c84a682a","Type":"ContainerDied","Data":"5bec58cb6906cfcb53c07d0fe4b75ba3e9c48982bc8729f2886b1a707a3144c8"} Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.121351 4831 scope.go:117] "RemoveContainer" containerID="1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59" Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.121551 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.143407 4831 scope.go:117] "RemoveContainer" containerID="1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59" Dec 03 08:07:00 crc kubenswrapper[4831]: E1203 08:07:00.145632 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59\": container with ID starting with 1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59 not found: ID does not exist" containerID="1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59" Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.145676 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59"} err="failed to get container status \"1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59\": rpc error: code = NotFound desc = could not find container \"1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59\": container with ID starting with 1c44f691304559c566cb2d3528b01483905951d676e4dcc6f7916a605eb0aa59 not found: ID does not exist" Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.185802 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrxhb\" (UniqueName: \"kubernetes.io/projected/a6c85412-de26-4d5f-91c0-fef3c84a682a-kube-api-access-xrxhb\") pod \"a6c85412-de26-4d5f-91c0-fef3c84a682a\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") " Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.185937 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-combined-ca-bundle\") pod \"a6c85412-de26-4d5f-91c0-fef3c84a682a\" (UID: 
\"a6c85412-de26-4d5f-91c0-fef3c84a682a\") "
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.185978 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-config-data\") pod \"a6c85412-de26-4d5f-91c0-fef3c84a682a\" (UID: \"a6c85412-de26-4d5f-91c0-fef3c84a682a\") "
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.191205 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6c85412-de26-4d5f-91c0-fef3c84a682a-kube-api-access-xrxhb" (OuterVolumeSpecName: "kube-api-access-xrxhb") pod "a6c85412-de26-4d5f-91c0-fef3c84a682a" (UID: "a6c85412-de26-4d5f-91c0-fef3c84a682a"). InnerVolumeSpecName "kube-api-access-xrxhb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.228150 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-config-data" (OuterVolumeSpecName: "config-data") pod "a6c85412-de26-4d5f-91c0-fef3c84a682a" (UID: "a6c85412-de26-4d5f-91c0-fef3c84a682a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.232748 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a6c85412-de26-4d5f-91c0-fef3c84a682a" (UID: "a6c85412-de26-4d5f-91c0-fef3c84a682a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.288534 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrxhb\" (UniqueName: \"kubernetes.io/projected/a6c85412-de26-4d5f-91c0-fef3c84a682a-kube-api-access-xrxhb\") on node \"crc\" DevicePath \"\""
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.288565 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.288574 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a6c85412-de26-4d5f-91c0-fef3c84a682a-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.427740 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.440519 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.440648 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.492197 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.505359 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.528561 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 03 08:07:00 crc kubenswrapper[4831]: E1203 08:07:00.528972 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04ea935-caf4-46ad-8187-bd86753b3692" containerName="dnsmasq-dns"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.528989 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04ea935-caf4-46ad-8187-bd86753b3692" containerName="dnsmasq-dns"
Dec 03 08:07:00 crc kubenswrapper[4831]: E1203 08:07:00.529019 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6c85412-de26-4d5f-91c0-fef3c84a682a" containerName="nova-cell1-conductor-conductor"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.529026 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6c85412-de26-4d5f-91c0-fef3c84a682a" containerName="nova-cell1-conductor-conductor"
Dec 03 08:07:00 crc kubenswrapper[4831]: E1203 08:07:00.529038 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a04ea935-caf4-46ad-8187-bd86753b3692" containerName="init"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.529044 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a04ea935-caf4-46ad-8187-bd86753b3692" containerName="init"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.529212 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6c85412-de26-4d5f-91c0-fef3c84a682a" containerName="nova-cell1-conductor-conductor"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.529231 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a04ea935-caf4-46ad-8187-bd86753b3692" containerName="dnsmasq-dns"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.529860 4831 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.532094 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.568992 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.701293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwv4t\" (UniqueName: \"kubernetes.io/projected/f75ecfa5-4e93-432b-907b-e79a5de81fc9-kube-api-access-hwv4t\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.701367 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.701426 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.803483 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwv4t\" (UniqueName: \"kubernetes.io/projected/f75ecfa5-4e93-432b-907b-e79a5de81fc9-kube-api-access-hwv4t\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.803548 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.803628 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.810260 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.810941 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.824491 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwv4t\" (UniqueName: \"kubernetes.io/projected/f75ecfa5-4e93-432b-907b-e79a5de81fc9-kube-api-access-hwv4t\") pod \"nova-cell1-conductor-0\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:00 crc kubenswrapper[4831]: I1203 08:07:00.864972 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.022257 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a04ea935-caf4-46ad-8187-bd86753b3692" path="/var/lib/kubelet/pods/a04ea935-caf4-46ad-8187-bd86753b3692/volumes"
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.023057 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6c85412-de26-4d5f-91c0-fef3c84a682a" path="/var/lib/kubelet/pods/a6c85412-de26-4d5f-91c0-fef3c84a682a/volumes"
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.171089 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.673219 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.821097 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjfxn\" (UniqueName: \"kubernetes.io/projected/1b192790-67c2-4112-bd22-9b0abbfb9394-kube-api-access-jjfxn\") pod \"1b192790-67c2-4112-bd22-9b0abbfb9394\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") "
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.821437 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-combined-ca-bundle\") pod \"1b192790-67c2-4112-bd22-9b0abbfb9394\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") "
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.821597 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-config-data\") pod \"1b192790-67c2-4112-bd22-9b0abbfb9394\" (UID: \"1b192790-67c2-4112-bd22-9b0abbfb9394\") "
Dec 03 08:07:01 crc
kubenswrapper[4831]: I1203 08:07:01.826442 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b192790-67c2-4112-bd22-9b0abbfb9394-kube-api-access-jjfxn" (OuterVolumeSpecName: "kube-api-access-jjfxn") pod "1b192790-67c2-4112-bd22-9b0abbfb9394" (UID: "1b192790-67c2-4112-bd22-9b0abbfb9394"). InnerVolumeSpecName "kube-api-access-jjfxn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.864507 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-config-data" (OuterVolumeSpecName: "config-data") pod "1b192790-67c2-4112-bd22-9b0abbfb9394" (UID: "1b192790-67c2-4112-bd22-9b0abbfb9394"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.868838 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1b192790-67c2-4112-bd22-9b0abbfb9394" (UID: "1b192790-67c2-4112-bd22-9b0abbfb9394"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.923883 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.923927 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1b192790-67c2-4112-bd22-9b0abbfb9394-config-data\") on node \"crc\" DevicePath \"\""
Dec 03 08:07:01 crc kubenswrapper[4831]: I1203 08:07:01.923942 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjfxn\" (UniqueName: \"kubernetes.io/projected/1b192790-67c2-4112-bd22-9b0abbfb9394-kube-api-access-jjfxn\") on node \"crc\" DevicePath \"\""
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.158780 4831 generic.go:334] "Generic (PLEG): container finished" podID="1b192790-67c2-4112-bd22-9b0abbfb9394" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396" exitCode=0
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.158838 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1b192790-67c2-4112-bd22-9b0abbfb9394","Type":"ContainerDied","Data":"7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396"}
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.158865 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"1b192790-67c2-4112-bd22-9b0abbfb9394","Type":"ContainerDied","Data":"f7642a4e280ad328a6c8bc5839918eee3b3b63f365ceb76ca05adca364353fd8"}
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.158882 4831 scope.go:117] "RemoveContainer" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.158996 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.173559 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f75ecfa5-4e93-432b-907b-e79a5de81fc9","Type":"ContainerStarted","Data":"82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5"}
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.173618 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f75ecfa5-4e93-432b-907b-e79a5de81fc9","Type":"ContainerStarted","Data":"9c6849908295908bc4900f5f55dc2239af4f0d39875298386f62d2f78e66e0fa"}
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.174579 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.191692 4831 scope.go:117] "RemoveContainer" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396"
Dec 03 08:07:02 crc kubenswrapper[4831]: E1203 08:07:02.192255 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396\": container with ID starting with 7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396 not found: ID does not exist" containerID="7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.192356 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396"} err="failed to get container status \"7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396\": rpc error: code = NotFound desc = could not find container \"7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396\":
container with ID starting with 7499895aa7626c5bde98f6859831ef2ebc4f66b818bc5587ed0c53be6d4d5396 not found: ID does not exist"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.212506 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.21248268 podStartE2EDuration="2.21248268s" podCreationTimestamp="2025-12-03 08:07:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:07:02.207773784 +0000 UTC m=+5759.551357292" watchObservedRunningTime="2025-12-03 08:07:02.21248268 +0000 UTC m=+5759.556066198"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.236526 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.258773 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.269046 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 08:07:02 crc kubenswrapper[4831]: E1203 08:07:02.269440 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b192790-67c2-4112-bd22-9b0abbfb9394" containerName="nova-cell0-conductor-conductor"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.269457 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b192790-67c2-4112-bd22-9b0abbfb9394" containerName="nova-cell0-conductor-conductor"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.269627 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b192790-67c2-4112-bd22-9b0abbfb9394" containerName="nova-cell0-conductor-conductor"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.270222 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.273401 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.279997 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.446758 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.446826 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.446858 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6twn\" (UniqueName: \"kubernetes.io/projected/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-kube-api-access-n6twn\") pod \"nova-cell0-conductor-0\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.496373 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.548510 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.548598 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.548647 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6twn\" (UniqueName: \"kubernetes.io/projected/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-kube-api-access-n6twn\") pod \"nova-cell0-conductor-0\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.554216 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.557067 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.575131 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6twn\" (UniqueName: \"kubernetes.io/projected/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-kube-api-access-n6twn\") pod \"nova-cell0-conductor-0\"
(UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.605521 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.650608 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 08:07:02 crc kubenswrapper[4831]: I1203 08:07:02.677575 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 08:07:03 crc kubenswrapper[4831]: I1203 08:07:03.026819 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b192790-67c2-4112-bd22-9b0abbfb9394" path="/var/lib/kubelet/pods/1b192790-67c2-4112-bd22-9b0abbfb9394/volumes"
Dec 03 08:07:03 crc kubenswrapper[4831]: I1203 08:07:03.084755 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"]
Dec 03 08:07:03 crc kubenswrapper[4831]: I1203 08:07:03.187376 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135","Type":"ContainerStarted","Data":"6e4db7ee316c8f000d3c245b8096f7b54bd893bcd3f3ddfe39271f3b01d291e6"}
Dec 03 08:07:03 crc kubenswrapper[4831]: I1203 08:07:03.197771 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0"
Dec 03 08:07:04 crc kubenswrapper[4831]: I1203 08:07:04.195014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135","Type":"ContainerStarted","Data":"6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120"}
Dec 03 08:07:04 crc kubenswrapper[4831]: I1203 08:07:04.195239 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:04 crc kubenswrapper[4831]: I1203 08:07:04.215622 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.215597967 podStartE2EDuration="2.215597967s" podCreationTimestamp="2025-12-03 08:07:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:07:04.209766216 +0000 UTC m=+5761.553349724" watchObservedRunningTime="2025-12-03 08:07:04.215597967 +0000 UTC m=+5761.559181475"
Dec 03 08:07:05 crc kubenswrapper[4831]: I1203 08:07:05.440365 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 03 08:07:05 crc kubenswrapper[4831]: I1203 08:07:05.440435 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Dec 03 08:07:05 crc kubenswrapper[4831]: I1203 08:07:05.451919 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 03 08:07:05 crc kubenswrapper[4831]: I1203 08:07:05.452470 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Dec 03 08:07:06 crc kubenswrapper[4831]: I1203 08:07:06.604481 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.89:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 08:07:06 crc kubenswrapper[4831]: I1203 08:07:06.604521 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.89:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 08:07:06 crc kubenswrapper[4831]: I1203 08:07:06.604547 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.90:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 08:07:06 crc kubenswrapper[4831]: I1203 08:07:06.604999 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.90:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Dec 03 08:07:07 crc kubenswrapper[4831]: I1203 08:07:07.496567 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Dec 03 08:07:07 crc kubenswrapper[4831]: I1203 08:07:07.524167 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Dec 03 08:07:08 crc kubenswrapper[4831]: I1203 08:07:08.284445 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Dec 03 08:07:09 crc kubenswrapper[4831]: I1203 08:07:09.018616 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342"
Dec 03 08:07:09 crc kubenswrapper[4831]: E1203 08:07:09.019027 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:07:10 crc kubenswrapper[4831]: I1203 08:07:10.905474 4831 kubelet.go:2542] "SyncLoop (probe)"
probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0"
Dec 03 08:07:12 crc kubenswrapper[4831]: I1203 08:07:12.635049 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0"
Dec 03 08:07:15 crc kubenswrapper[4831]: I1203 08:07:15.443815 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 03 08:07:15 crc kubenswrapper[4831]: I1203 08:07:15.444347 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Dec 03 08:07:15 crc kubenswrapper[4831]: I1203 08:07:15.447019 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 03 08:07:15 crc kubenswrapper[4831]: I1203 08:07:15.447741 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Dec 03 08:07:15 crc kubenswrapper[4831]: I1203 08:07:15.456730 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 03 08:07:15 crc kubenswrapper[4831]: I1203 08:07:15.457526 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 03 08:07:15 crc kubenswrapper[4831]: I1203 08:07:15.458460 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Dec 03 08:07:15 crc kubenswrapper[4831]: I1203 08:07:15.460247 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.322114 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.329276 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.675516 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.678301 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.681132 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.689073 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.833604 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.833832 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-scripts\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.833881 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.833921 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.834239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x46pg\" (UniqueName: \"kubernetes.io/projected/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-kube-api-access-x46pg\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.834345 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.935939 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-scripts\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.936014 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.936068 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16
crc kubenswrapper[4831]: I1203 08:07:16.936189 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x46pg\" (UniqueName: \"kubernetes.io/projected/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-kube-api-access-x46pg\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.936232 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.936279 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.937420 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.942134 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-scripts\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.942241 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.945859 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.961577 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.980385 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x46pg\" (UniqueName: \"kubernetes.io/projected/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-kube-api-access-x46pg\") pod \"cinder-scheduler-0\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " pod="openstack/cinder-scheduler-0"
Dec 03 08:07:16 crc kubenswrapper[4831]: I1203 08:07:16.997797 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0"
Dec 03 08:07:17 crc kubenswrapper[4831]: I1203 08:07:17.443733 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"]
Dec 03 08:07:17 crc kubenswrapper[4831]: W1203 08:07:17.446394 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b9f7fd1_b910_4ac7_bd1f_a2961d780e1e.slice/crio-83b8c4ed0ae6f8c59110e4a136b5673171ed8a84b31a11ffab449c1a6d2790d3 WatchSource:0}: Error finding container 83b8c4ed0ae6f8c59110e4a136b5673171ed8a84b31a11ffab449c1a6d2790d3: Status 404 returned error can't find the container with id 83b8c4ed0ae6f8c59110e4a136b5673171ed8a84b31a11ffab449c1a6d2790d3
Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.259905 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"]
Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.260632 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api-log" containerID="cri-o://99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c" gracePeriod=30
Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.260666 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api" containerID="cri-o://7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40" gracePeriod=30
Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.348590 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e","Type":"ContainerStarted","Data":"5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174"}
Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.348640 4831 kubelet.go:2453] "SyncLoop (PLEG):
event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e","Type":"ContainerStarted","Data":"83b8c4ed0ae6f8c59110e4a136b5673171ed8a84b31a11ffab449c1a6d2790d3"} Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.880116 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.882491 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.895903 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.896588 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.979876 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.979939 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.979974 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " 
pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980015 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980066 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980108 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980142 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980164 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " 
pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980192 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-run\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980215 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980240 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980276 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980300 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8a7f424f-8685-47e7-a374-44e1e824e364-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc 
kubenswrapper[4831]: I1203 08:07:18.980343 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt86v\" (UniqueName: \"kubernetes.io/projected/8a7f424f-8685-47e7-a374-44e1e824e364-kube-api-access-lt86v\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980370 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:18 crc kubenswrapper[4831]: I1203 08:07:18.980435 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081462 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081511 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081541 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081567 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081602 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081633 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081652 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081671 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081689 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-run\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081704 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081723 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081749 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081766 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8a7f424f-8685-47e7-a374-44e1e824e364-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " 
pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081785 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt86v\" (UniqueName: \"kubernetes.io/projected/8a7f424f-8685-47e7-a374-44e1e824e364-kube-api-access-lt86v\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081805 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.081846 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.082627 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.082671 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-sys\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.082694 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-dev\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.082857 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.082935 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.090125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.090211 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.091468 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-config-data-custom\") pod 
\"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.091544 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.091584 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.091613 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.091635 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8a7f424f-8685-47e7-a374-44e1e824e364-run\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.095025 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/8a7f424f-8685-47e7-a374-44e1e824e364-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.099612 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.099879 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8a7f424f-8685-47e7-a374-44e1e824e364-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.109912 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt86v\" (UniqueName: \"kubernetes.io/projected/8a7f424f-8685-47e7-a374-44e1e824e364-kube-api-access-lt86v\") pod \"cinder-volume-volume1-0\" (UID: \"8a7f424f-8685-47e7-a374-44e1e824e364\") " pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.257149 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.371845 4831 generic.go:334] "Generic (PLEG): container finished" podID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerID="99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c" exitCode=143 Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.372211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2255c247-84da-4f6b-9ea8-f0cc38ca7313","Type":"ContainerDied","Data":"99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c"} Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.375502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e","Type":"ContainerStarted","Data":"b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1"} Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.397539 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.397520488 podStartE2EDuration="3.397520488s" podCreationTimestamp="2025-12-03 08:07:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:07:19.390387326 +0000 UTC m=+5776.733970844" watchObservedRunningTime="2025-12-03 08:07:19.397520488 +0000 UTC m=+5776.741103996" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.601972 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.613477 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.613797 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.618973 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.692717 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.692788 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-sys\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.692815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0dd4df11-40e0-455a-81a1-e0b82a541868-ceph\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693012 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693205 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693266 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-scripts\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693405 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-lib-modules\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693447 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693494 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-config-data\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693526 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-run\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693590 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693638 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693694 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693733 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-dev\") pod \"cinder-backup-0\" (UID: 
\"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.693851 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28tpx\" (UniqueName: \"kubernetes.io/projected/0dd4df11-40e0-455a-81a1-e0b82a541868-kube-api-access-28tpx\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795281 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795386 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795420 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-scripts\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795440 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-lib-modules\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795456 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795476 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-config-data\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795505 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-run\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795528 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795530 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-lib-modules\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795548 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-iscsi\") pod \"cinder-backup-0\" (UID: 
\"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795593 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-run\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795614 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-nvme\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795627 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795655 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795774 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795839 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795847 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795891 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-dev\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795935 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-28tpx\" (UniqueName: \"kubernetes.io/projected/0dd4df11-40e0-455a-81a1-e0b82a541868-kube-api-access-28tpx\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795970 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-dev\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.795972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc 
kubenswrapper[4831]: I1203 08:07:19.796086 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.796159 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-sys\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.796198 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0dd4df11-40e0-455a-81a1-e0b82a541868-ceph\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.796231 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.796271 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0dd4df11-40e0-455a-81a1-e0b82a541868-sys\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.801301 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/0dd4df11-40e0-455a-81a1-e0b82a541868-ceph\") pod \"cinder-backup-0\" (UID: 
\"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.802829 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-config-data-custom\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.802853 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.803268 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-scripts\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.803498 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd4df11-40e0-455a-81a1-e0b82a541868-config-data\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.817836 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-28tpx\" (UniqueName: \"kubernetes.io/projected/0dd4df11-40e0-455a-81a1-e0b82a541868-kube-api-access-28tpx\") pod \"cinder-backup-0\" (UID: \"0dd4df11-40e0-455a-81a1-e0b82a541868\") " pod="openstack/cinder-backup-0" Dec 03 08:07:19 crc kubenswrapper[4831]: W1203 08:07:19.822679 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a7f424f_8685_47e7_a374_44e1e824e364.slice/crio-c20a4eab7ff316c69fd0207097a32c2791e3e276b631305967b2723bf113ecc5 WatchSource:0}: Error finding container c20a4eab7ff316c69fd0207097a32c2791e3e276b631305967b2723bf113ecc5: Status 404 returned error can't find the container with id c20a4eab7ff316c69fd0207097a32c2791e3e276b631305967b2723bf113ecc5 Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.824816 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.830959 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Dec 03 08:07:19 crc kubenswrapper[4831]: I1203 08:07:19.938563 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Dec 03 08:07:20 crc kubenswrapper[4831]: I1203 08:07:20.385388 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8a7f424f-8685-47e7-a374-44e1e824e364","Type":"ContainerStarted","Data":"c20a4eab7ff316c69fd0207097a32c2791e3e276b631305967b2723bf113ecc5"} Dec 03 08:07:20 crc kubenswrapper[4831]: I1203 08:07:20.495627 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Dec 03 08:07:21 crc kubenswrapper[4831]: I1203 08:07:21.404238 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8a7f424f-8685-47e7-a374-44e1e824e364","Type":"ContainerStarted","Data":"0feb1f410290d8aac9f3af13be773a548bf0b10761cf243775717044d0f5b02e"} Dec 03 08:07:21 crc kubenswrapper[4831]: I1203 08:07:21.404957 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"8a7f424f-8685-47e7-a374-44e1e824e364","Type":"ContainerStarted","Data":"cdbed9a47e0ae85da061a70ab1155925241e69c0f524a8e0579972fa82987208"} Dec 03 
08:07:21 crc kubenswrapper[4831]: I1203 08:07:21.410866 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0dd4df11-40e0-455a-81a1-e0b82a541868","Type":"ContainerStarted","Data":"207d1a6ad87d70b9e8c057a13ba63c514916d1864d508056634b3108d6d49712"} Dec 03 08:07:21 crc kubenswrapper[4831]: I1203 08:07:21.437405 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=2.67655933 podStartE2EDuration="3.437279495s" podCreationTimestamp="2025-12-03 08:07:18 +0000 UTC" firstStartedPulling="2025-12-03 08:07:19.824633129 +0000 UTC m=+5777.168216637" lastFinishedPulling="2025-12-03 08:07:20.585353294 +0000 UTC m=+5777.928936802" observedRunningTime="2025-12-03 08:07:21.431492715 +0000 UTC m=+5778.775076263" watchObservedRunningTime="2025-12-03 08:07:21.437279495 +0000 UTC m=+5778.780863023" Dec 03 08:07:21 crc kubenswrapper[4831]: I1203 08:07:21.442689 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.1.87:8776/healthcheck\": read tcp 10.217.0.2:41790->10.217.1.87:8776: read: connection reset by peer" Dec 03 08:07:21 crc kubenswrapper[4831]: I1203 08:07:21.922071 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.000538 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.051111 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rvtz6\" (UniqueName: \"kubernetes.io/projected/2255c247-84da-4f6b-9ea8-f0cc38ca7313-kube-api-access-rvtz6\") pod \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.051196 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2255c247-84da-4f6b-9ea8-f0cc38ca7313-logs\") pod \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.051237 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-scripts\") pod \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.051282 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2255c247-84da-4f6b-9ea8-f0cc38ca7313-etc-machine-id\") pod \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.051305 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-combined-ca-bundle\") pod \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " Dec 03 08:07:22 crc 
kubenswrapper[4831]: I1203 08:07:22.051434 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data-custom\") pod \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.051474 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data\") pod \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\" (UID: \"2255c247-84da-4f6b-9ea8-f0cc38ca7313\") " Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.051882 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2255c247-84da-4f6b-9ea8-f0cc38ca7313-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "2255c247-84da-4f6b-9ea8-f0cc38ca7313" (UID: "2255c247-84da-4f6b-9ea8-f0cc38ca7313"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.052278 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2255c247-84da-4f6b-9ea8-f0cc38ca7313-logs" (OuterVolumeSpecName: "logs") pod "2255c247-84da-4f6b-9ea8-f0cc38ca7313" (UID: "2255c247-84da-4f6b-9ea8-f0cc38ca7313"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.052404 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/2255c247-84da-4f6b-9ea8-f0cc38ca7313-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.052423 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2255c247-84da-4f6b-9ea8-f0cc38ca7313-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.054593 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-scripts" (OuterVolumeSpecName: "scripts") pod "2255c247-84da-4f6b-9ea8-f0cc38ca7313" (UID: "2255c247-84da-4f6b-9ea8-f0cc38ca7313"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.054995 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2255c247-84da-4f6b-9ea8-f0cc38ca7313-kube-api-access-rvtz6" (OuterVolumeSpecName: "kube-api-access-rvtz6") pod "2255c247-84da-4f6b-9ea8-f0cc38ca7313" (UID: "2255c247-84da-4f6b-9ea8-f0cc38ca7313"). InnerVolumeSpecName "kube-api-access-rvtz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.055027 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "2255c247-84da-4f6b-9ea8-f0cc38ca7313" (UID: "2255c247-84da-4f6b-9ea8-f0cc38ca7313"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.104625 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2255c247-84da-4f6b-9ea8-f0cc38ca7313" (UID: "2255c247-84da-4f6b-9ea8-f0cc38ca7313"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.106476 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data" (OuterVolumeSpecName: "config-data") pod "2255c247-84da-4f6b-9ea8-f0cc38ca7313" (UID: "2255c247-84da-4f6b-9ea8-f0cc38ca7313"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.157345 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.157378 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rvtz6\" (UniqueName: \"kubernetes.io/projected/2255c247-84da-4f6b-9ea8-f0cc38ca7313-kube-api-access-rvtz6\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.157390 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.157399 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 
08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.157407 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/2255c247-84da-4f6b-9ea8-f0cc38ca7313-config-data-custom\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.421441 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0dd4df11-40e0-455a-81a1-e0b82a541868","Type":"ContainerStarted","Data":"972167e2ef269cd7c7ebb6abaf52e017d4b5409e50368fc5c1527a7e5fea047d"} Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.421695 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"0dd4df11-40e0-455a-81a1-e0b82a541868","Type":"ContainerStarted","Data":"ce6b94cc348af813f5a605950daa63c458141433b801f6f82cc75e735744dc57"} Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.430575 4831 generic.go:334] "Generic (PLEG): container finished" podID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerID="7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40" exitCode=0 Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.430654 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.430674 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2255c247-84da-4f6b-9ea8-f0cc38ca7313","Type":"ContainerDied","Data":"7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40"} Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.430741 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"2255c247-84da-4f6b-9ea8-f0cc38ca7313","Type":"ContainerDied","Data":"c53206489e8a93fa75232cc21a1c21d0bb1997feb3302fd05b9bdde9eb7224a3"} Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.430760 4831 scope.go:117] "RemoveContainer" containerID="7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.452600 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=2.348328957 podStartE2EDuration="3.452579962s" podCreationTimestamp="2025-12-03 08:07:19 +0000 UTC" firstStartedPulling="2025-12-03 08:07:20.544041647 +0000 UTC m=+5777.887625165" lastFinishedPulling="2025-12-03 08:07:21.648292662 +0000 UTC m=+5778.991876170" observedRunningTime="2025-12-03 08:07:22.442075364 +0000 UTC m=+5779.785658872" watchObservedRunningTime="2025-12-03 08:07:22.452579962 +0000 UTC m=+5779.796163470" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.474825 4831 scope.go:117] "RemoveContainer" containerID="99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.494691 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.510511 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.525157 4831 scope.go:117] 
"RemoveContainer" containerID="7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40" Dec 03 08:07:22 crc kubenswrapper[4831]: E1203 08:07:22.527021 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40\": container with ID starting with 7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40 not found: ID does not exist" containerID="7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.527074 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40"} err="failed to get container status \"7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40\": rpc error: code = NotFound desc = could not find container \"7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40\": container with ID starting with 7c96e53cf2d0eb10e870d09bd9eefca080a94cec234950242ce9cee17122fe40 not found: ID does not exist" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.527113 4831 scope.go:117] "RemoveContainer" containerID="99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c" Dec 03 08:07:22 crc kubenswrapper[4831]: E1203 08:07:22.527627 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c\": container with ID starting with 99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c not found: ID does not exist" containerID="99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.527675 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c"} err="failed to get container status \"99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c\": rpc error: code = NotFound desc = could not find container \"99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c\": container with ID starting with 99b2dba27bc21e4e137c98b76154b31e0140a01f34c2f9b7c2b7e192768f125c not found: ID does not exist" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.527917 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Dec 03 08:07:22 crc kubenswrapper[4831]: E1203 08:07:22.528431 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.528448 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api" Dec 03 08:07:22 crc kubenswrapper[4831]: E1203 08:07:22.528494 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api-log" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.528506 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api-log" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.528750 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.528788 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" containerName="cinder-api-log" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.530194 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.534810 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.539241 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.667480 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.667906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80590171-e127-452a-8c6f-666b84bd0a6e-logs\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.667987 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9df4\" (UniqueName: \"kubernetes.io/projected/80590171-e127-452a-8c6f-666b84bd0a6e-kube-api-access-q9df4\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.668039 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80590171-e127-452a-8c6f-666b84bd0a6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.668107 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-config-data\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.668131 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-scripts\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.668160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.769767 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9df4\" (UniqueName: \"kubernetes.io/projected/80590171-e127-452a-8c6f-666b84bd0a6e-kube-api-access-q9df4\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.769887 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80590171-e127-452a-8c6f-666b84bd0a6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.769960 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-config-data\") pod 
\"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.769989 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-scripts\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.770025 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.770084 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.770112 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/80590171-e127-452a-8c6f-666b84bd0a6e-logs\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.770204 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/80590171-e127-452a-8c6f-666b84bd0a6e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.770690 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" 
(UniqueName: \"kubernetes.io/empty-dir/80590171-e127-452a-8c6f-666b84bd0a6e-logs\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.786016 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.786039 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-config-data-custom\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.786179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-scripts\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.787081 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/80590171-e127-452a-8c6f-666b84bd0a6e-config-data\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.791110 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9df4\" (UniqueName: \"kubernetes.io/projected/80590171-e127-452a-8c6f-666b84bd0a6e-kube-api-access-q9df4\") pod \"cinder-api-0\" (UID: \"80590171-e127-452a-8c6f-666b84bd0a6e\") " pod="openstack/cinder-api-0" Dec 03 08:07:22 crc kubenswrapper[4831]: I1203 08:07:22.858696 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Dec 03 08:07:23 crc kubenswrapper[4831]: I1203 08:07:23.033290 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2255c247-84da-4f6b-9ea8-f0cc38ca7313" path="/var/lib/kubelet/pods/2255c247-84da-4f6b-9ea8-f0cc38ca7313/volumes" Dec 03 08:07:23 crc kubenswrapper[4831]: I1203 08:07:23.301766 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Dec 03 08:07:23 crc kubenswrapper[4831]: I1203 08:07:23.447851 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80590171-e127-452a-8c6f-666b84bd0a6e","Type":"ContainerStarted","Data":"6ae9f77630b793560893c3ee9b231408238b3f476812e29e3e76cbf747418367"} Dec 03 08:07:24 crc kubenswrapper[4831]: I1203 08:07:24.014986 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:07:24 crc kubenswrapper[4831]: E1203 08:07:24.015499 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:07:24 crc kubenswrapper[4831]: I1203 08:07:24.257996 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:24 crc kubenswrapper[4831]: I1203 08:07:24.469195 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80590171-e127-452a-8c6f-666b84bd0a6e","Type":"ContainerStarted","Data":"02811415c2cacfdfd31a015813c0140f90f8249ee923b3bbcab9b18280dbba7f"} Dec 03 08:07:24 crc kubenswrapper[4831]: I1203 
08:07:24.939639 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Dec 03 08:07:25 crc kubenswrapper[4831]: I1203 08:07:25.479714 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"80590171-e127-452a-8c6f-666b84bd0a6e","Type":"ContainerStarted","Data":"0a41ddacd027bfdf2374e5261a908c292522f3bc38f4daa3ee246d3ca6ce533f"} Dec 03 08:07:25 crc kubenswrapper[4831]: I1203 08:07:25.480047 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Dec 03 08:07:25 crc kubenswrapper[4831]: I1203 08:07:25.501984 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.501949657 podStartE2EDuration="3.501949657s" podCreationTimestamp="2025-12-03 08:07:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:07:25.498410988 +0000 UTC m=+5782.841994526" watchObservedRunningTime="2025-12-03 08:07:25.501949657 +0000 UTC m=+5782.845533165" Dec 03 08:07:27 crc kubenswrapper[4831]: I1203 08:07:27.224891 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 08:07:27 crc kubenswrapper[4831]: I1203 08:07:27.333576 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 08:07:27 crc kubenswrapper[4831]: I1203 08:07:27.507879 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerName="cinder-scheduler" containerID="cri-o://5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174" gracePeriod=30 Dec 03 08:07:27 crc kubenswrapper[4831]: I1203 08:07:27.508035 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" 
podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerName="probe" containerID="cri-o://b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1" gracePeriod=30 Dec 03 08:07:28 crc kubenswrapper[4831]: I1203 08:07:28.519290 4831 generic.go:334] "Generic (PLEG): container finished" podID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerID="b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1" exitCode=0 Dec 03 08:07:28 crc kubenswrapper[4831]: I1203 08:07:28.519354 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e","Type":"ContainerDied","Data":"b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1"} Dec 03 08:07:29 crc kubenswrapper[4831]: I1203 08:07:29.552729 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0" Dec 03 08:07:29 crc kubenswrapper[4831]: I1203 08:07:29.988678 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.153637 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x46pg\" (UniqueName: \"kubernetes.io/projected/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-kube-api-access-x46pg\") pod \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.153720 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-scripts\") pod \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.153777 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-etc-machine-id\") pod \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.153842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-combined-ca-bundle\") pod \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.153871 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data-custom\") pod \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.153917 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data\") pod \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\" (UID: \"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e\") " Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.154982 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" (UID: "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.159231 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-scripts" (OuterVolumeSpecName: "scripts") pod "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" (UID: "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.159368 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-kube-api-access-x46pg" (OuterVolumeSpecName: "kube-api-access-x46pg") pod "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" (UID: "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e"). InnerVolumeSpecName "kube-api-access-x46pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.165346 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" (UID: "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.186659 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.209176 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" (UID: "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.256273 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x46pg\" (UniqueName: \"kubernetes.io/projected/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-kube-api-access-x46pg\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.256302 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.256326 4831 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.256336 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.256348 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data-custom\") on node \"crc\" 
DevicePath \"\"" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.269603 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data" (OuterVolumeSpecName: "config-data") pod "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" (UID: "4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.359382 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.548085 4831 generic.go:334] "Generic (PLEG): container finished" podID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerID="5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174" exitCode=0 Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.548148 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e","Type":"ContainerDied","Data":"5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174"} Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.548153 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.548183 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e","Type":"ContainerDied","Data":"83b8c4ed0ae6f8c59110e4a136b5673171ed8a84b31a11ffab449c1a6d2790d3"} Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.548229 4831 scope.go:117] "RemoveContainer" containerID="b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.585866 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.595722 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.615707 4831 scope.go:117] "RemoveContainer" containerID="5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.631597 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 08:07:30 crc kubenswrapper[4831]: E1203 08:07:30.632417 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerName="cinder-scheduler" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.632441 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerName="cinder-scheduler" Dec 03 08:07:30 crc kubenswrapper[4831]: E1203 08:07:30.632460 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerName="probe" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.632470 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerName="probe" Dec 03 08:07:30 crc kubenswrapper[4831]: 
I1203 08:07:30.632788 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerName="cinder-scheduler" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.632829 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" containerName="probe" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.634968 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.638813 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.649932 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.650523 4831 scope.go:117] "RemoveContainer" containerID="b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1" Dec 03 08:07:30 crc kubenswrapper[4831]: E1203 08:07:30.666518 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1\": container with ID starting with b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1 not found: ID does not exist" containerID="b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.666582 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1"} err="failed to get container status \"b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1\": rpc error: code = NotFound desc = could not find container \"b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1\": container with ID starting with 
b68226d1b35f49639ae6a8e9e9a40a35446ce5d246ff0f0e733c4de81dad74d1 not found: ID does not exist" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.666619 4831 scope.go:117] "RemoveContainer" containerID="5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174" Dec 03 08:07:30 crc kubenswrapper[4831]: E1203 08:07:30.667647 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174\": container with ID starting with 5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174 not found: ID does not exist" containerID="5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.667702 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174"} err="failed to get container status \"5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174\": rpc error: code = NotFound desc = could not find container \"5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174\": container with ID starting with 5e26d7b6c2cceb588cdcd40f0932dad066ffced9aa22cefc5be5a729baadb174 not found: ID does not exist" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.767560 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.767647 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.767851 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-scripts\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.767892 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-config-data\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.767958 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33a61679-48f8-4157-84db-db0208fc85ad-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.767992 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf7q4\" (UniqueName: \"kubernetes.io/projected/33a61679-48f8-4157-84db-db0208fc85ad-kube-api-access-sf7q4\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.869594 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-config-data-custom\") pod 
\"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.869640 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.869713 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-scripts\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.869732 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-config-data\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.869764 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33a61679-48f8-4157-84db-db0208fc85ad-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.869781 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf7q4\" (UniqueName: \"kubernetes.io/projected/33a61679-48f8-4157-84db-db0208fc85ad-kube-api-access-sf7q4\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 
08:07:30.869986 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/33a61679-48f8-4157-84db-db0208fc85ad-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.873052 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.873163 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.887108 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-config-data\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.888015 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/33a61679-48f8-4157-84db-db0208fc85ad-scripts\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.896499 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf7q4\" (UniqueName: 
\"kubernetes.io/projected/33a61679-48f8-4157-84db-db0208fc85ad-kube-api-access-sf7q4\") pod \"cinder-scheduler-0\" (UID: \"33a61679-48f8-4157-84db-db0208fc85ad\") " pod="openstack/cinder-scheduler-0" Dec 03 08:07:30 crc kubenswrapper[4831]: I1203 08:07:30.964293 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Dec 03 08:07:31 crc kubenswrapper[4831]: I1203 08:07:31.045144 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e" path="/var/lib/kubelet/pods/4b9f7fd1-b910-4ac7-bd1f-a2961d780e1e/volumes" Dec 03 08:07:31 crc kubenswrapper[4831]: I1203 08:07:31.430935 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Dec 03 08:07:31 crc kubenswrapper[4831]: W1203 08:07:31.435194 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod33a61679_48f8_4157_84db_db0208fc85ad.slice/crio-aeb5ab46283b520a25f381fc4fc707d245ba38a40f4b7a68bbdd52157805c414 WatchSource:0}: Error finding container aeb5ab46283b520a25f381fc4fc707d245ba38a40f4b7a68bbdd52157805c414: Status 404 returned error can't find the container with id aeb5ab46283b520a25f381fc4fc707d245ba38a40f4b7a68bbdd52157805c414 Dec 03 08:07:31 crc kubenswrapper[4831]: I1203 08:07:31.557047 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"33a61679-48f8-4157-84db-db0208fc85ad","Type":"ContainerStarted","Data":"aeb5ab46283b520a25f381fc4fc707d245ba38a40f4b7a68bbdd52157805c414"} Dec 03 08:07:32 crc kubenswrapper[4831]: I1203 08:07:32.572087 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"33a61679-48f8-4157-84db-db0208fc85ad","Type":"ContainerStarted","Data":"f078ff97678c4c3130a9a94d572e57f0ffd69ecf0e318b43c12cc8e1650e1745"} Dec 03 08:07:33 crc kubenswrapper[4831]: I1203 08:07:33.585956 4831 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"33a61679-48f8-4157-84db-db0208fc85ad","Type":"ContainerStarted","Data":"964a43dbc551294a748a4574e39e9c513f95b29c49807349a76405594da1f435"} Dec 03 08:07:33 crc kubenswrapper[4831]: I1203 08:07:33.625206 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.625174182 podStartE2EDuration="3.625174182s" podCreationTimestamp="2025-12-03 08:07:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:07:33.611688193 +0000 UTC m=+5790.955271741" watchObservedRunningTime="2025-12-03 08:07:33.625174182 +0000 UTC m=+5790.968757730" Dec 03 08:07:34 crc kubenswrapper[4831]: I1203 08:07:34.623867 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Dec 03 08:07:35 crc kubenswrapper[4831]: I1203 08:07:35.964584 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Dec 03 08:07:38 crc kubenswrapper[4831]: I1203 08:07:38.013369 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:07:38 crc kubenswrapper[4831]: E1203 08:07:38.014568 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:07:41 crc kubenswrapper[4831]: I1203 08:07:41.131827 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Dec 03 08:07:51 
crc kubenswrapper[4831]: I1203 08:07:51.013004 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:07:51 crc kubenswrapper[4831]: E1203 08:07:51.013823 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:08:05 crc kubenswrapper[4831]: I1203 08:08:05.013453 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:08:05 crc kubenswrapper[4831]: E1203 08:08:05.014459 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:08:19 crc kubenswrapper[4831]: I1203 08:08:19.013871 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:08:19 crc kubenswrapper[4831]: E1203 08:08:19.015011 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" 
Dec 03 08:08:31 crc kubenswrapper[4831]: I1203 08:08:31.013593 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:08:31 crc kubenswrapper[4831]: E1203 08:08:31.014287 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:08:42 crc kubenswrapper[4831]: I1203 08:08:42.014194 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:08:42 crc kubenswrapper[4831]: E1203 08:08:42.015426 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:08:56 crc kubenswrapper[4831]: I1203 08:08:56.013523 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:08:56 crc kubenswrapper[4831]: E1203 08:08:56.014724 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.673671 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-b9pkv"] Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.682636 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.694106 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9pkv"] Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.872648 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qppnd\" (UniqueName: \"kubernetes.io/projected/72966ca5-c358-48b0-89b0-72c2f65d3541-kube-api-access-qppnd\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.872895 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-utilities\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.873028 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-catalog-content\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.974737 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qppnd\" 
(UniqueName: \"kubernetes.io/projected/72966ca5-c358-48b0-89b0-72c2f65d3541-kube-api-access-qppnd\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.974862 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-utilities\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.974926 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-catalog-content\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.975727 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-catalog-content\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:02 crc kubenswrapper[4831]: I1203 08:09:02.975898 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-utilities\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:03 crc kubenswrapper[4831]: I1203 08:09:03.018484 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qppnd\" (UniqueName: 
\"kubernetes.io/projected/72966ca5-c358-48b0-89b0-72c2f65d3541-kube-api-access-qppnd\") pod \"redhat-marketplace-b9pkv\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:03 crc kubenswrapper[4831]: I1203 08:09:03.024855 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:03 crc kubenswrapper[4831]: W1203 08:09:03.567597 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72966ca5_c358_48b0_89b0_72c2f65d3541.slice/crio-97dbd1c6f9328bf48e52ea8d35c9fdd4a65f8189d879619461cd6a168f5a2fb7 WatchSource:0}: Error finding container 97dbd1c6f9328bf48e52ea8d35c9fdd4a65f8189d879619461cd6a168f5a2fb7: Status 404 returned error can't find the container with id 97dbd1c6f9328bf48e52ea8d35c9fdd4a65f8189d879619461cd6a168f5a2fb7 Dec 03 08:09:03 crc kubenswrapper[4831]: I1203 08:09:03.568117 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9pkv"] Dec 03 08:09:03 crc kubenswrapper[4831]: I1203 08:09:03.688191 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9pkv" event={"ID":"72966ca5-c358-48b0-89b0-72c2f65d3541","Type":"ContainerStarted","Data":"97dbd1c6f9328bf48e52ea8d35c9fdd4a65f8189d879619461cd6a168f5a2fb7"} Dec 03 08:09:04 crc kubenswrapper[4831]: I1203 08:09:04.699699 4831 generic.go:334] "Generic (PLEG): container finished" podID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerID="dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6" exitCode=0 Dec 03 08:09:04 crc kubenswrapper[4831]: I1203 08:09:04.699815 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9pkv" 
event={"ID":"72966ca5-c358-48b0-89b0-72c2f65d3541","Type":"ContainerDied","Data":"dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6"} Dec 03 08:09:06 crc kubenswrapper[4831]: I1203 08:09:06.724923 4831 generic.go:334] "Generic (PLEG): container finished" podID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerID="7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2" exitCode=0 Dec 03 08:09:06 crc kubenswrapper[4831]: I1203 08:09:06.725012 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9pkv" event={"ID":"72966ca5-c358-48b0-89b0-72c2f65d3541","Type":"ContainerDied","Data":"7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2"} Dec 03 08:09:07 crc kubenswrapper[4831]: I1203 08:09:07.014166 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:09:07 crc kubenswrapper[4831]: E1203 08:09:07.015079 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:09:07 crc kubenswrapper[4831]: I1203 08:09:07.737689 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9pkv" event={"ID":"72966ca5-c358-48b0-89b0-72c2f65d3541","Type":"ContainerStarted","Data":"92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac"} Dec 03 08:09:07 crc kubenswrapper[4831]: I1203 08:09:07.781276 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-b9pkv" podStartSLOduration=3.123231921 podStartE2EDuration="5.781116995s" 
podCreationTimestamp="2025-12-03 08:09:02 +0000 UTC" firstStartedPulling="2025-12-03 08:09:04.702058414 +0000 UTC m=+5882.045641932" lastFinishedPulling="2025-12-03 08:09:07.359943478 +0000 UTC m=+5884.703527006" observedRunningTime="2025-12-03 08:09:07.759259864 +0000 UTC m=+5885.102843392" watchObservedRunningTime="2025-12-03 08:09:07.781116995 +0000 UTC m=+5885.124700503" Dec 03 08:09:09 crc kubenswrapper[4831]: E1203 08:09:09.316537 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.234:51312->38.102.83.234:39573: write tcp 38.102.83.234:51312->38.102.83.234:39573: write: broken pipe Dec 03 08:09:12 crc kubenswrapper[4831]: I1203 08:09:12.088762 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-pxdqm"] Dec 03 08:09:12 crc kubenswrapper[4831]: I1203 08:09:12.130099 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c263-account-create-update-kj8xl"] Dec 03 08:09:12 crc kubenswrapper[4831]: I1203 08:09:12.142127 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-pxdqm"] Dec 03 08:09:12 crc kubenswrapper[4831]: I1203 08:09:12.151993 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c263-account-create-update-kj8xl"] Dec 03 08:09:13 crc kubenswrapper[4831]: I1203 08:09:13.024844 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c1dacad-6332-45b2-94be-7b25a1e3c463" path="/var/lib/kubelet/pods/3c1dacad-6332-45b2-94be-7b25a1e3c463/volumes" Dec 03 08:09:13 crc kubenswrapper[4831]: I1203 08:09:13.026067 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6436bc1e-96d4-47b2-9724-214eef860853" path="/var/lib/kubelet/pods/6436bc1e-96d4-47b2-9724-214eef860853/volumes" Dec 03 08:09:13 crc kubenswrapper[4831]: I1203 08:09:13.027059 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 
03 08:09:13 crc kubenswrapper[4831]: I1203 08:09:13.027115 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:13 crc kubenswrapper[4831]: I1203 08:09:13.071811 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:13 crc kubenswrapper[4831]: I1203 08:09:13.870382 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:13 crc kubenswrapper[4831]: I1203 08:09:13.953000 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9pkv"] Dec 03 08:09:15 crc kubenswrapper[4831]: I1203 08:09:15.826796 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-b9pkv" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerName="registry-server" containerID="cri-o://92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac" gracePeriod=2 Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.349192 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.437493 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-utilities\") pod \"72966ca5-c358-48b0-89b0-72c2f65d3541\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.437557 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-catalog-content\") pod \"72966ca5-c358-48b0-89b0-72c2f65d3541\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.437576 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qppnd\" (UniqueName: \"kubernetes.io/projected/72966ca5-c358-48b0-89b0-72c2f65d3541-kube-api-access-qppnd\") pod \"72966ca5-c358-48b0-89b0-72c2f65d3541\" (UID: \"72966ca5-c358-48b0-89b0-72c2f65d3541\") " Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.439591 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-utilities" (OuterVolumeSpecName: "utilities") pod "72966ca5-c358-48b0-89b0-72c2f65d3541" (UID: "72966ca5-c358-48b0-89b0-72c2f65d3541"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.445595 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72966ca5-c358-48b0-89b0-72c2f65d3541-kube-api-access-qppnd" (OuterVolumeSpecName: "kube-api-access-qppnd") pod "72966ca5-c358-48b0-89b0-72c2f65d3541" (UID: "72966ca5-c358-48b0-89b0-72c2f65d3541"). InnerVolumeSpecName "kube-api-access-qppnd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.470376 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "72966ca5-c358-48b0-89b0-72c2f65d3541" (UID: "72966ca5-c358-48b0-89b0-72c2f65d3541"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.538832 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.538863 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/72966ca5-c358-48b0-89b0-72c2f65d3541-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.538875 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qppnd\" (UniqueName: \"kubernetes.io/projected/72966ca5-c358-48b0-89b0-72c2f65d3541-kube-api-access-qppnd\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.851462 4831 generic.go:334] "Generic (PLEG): container finished" podID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerID="92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac" exitCode=0 Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.851551 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-b9pkv" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.851558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9pkv" event={"ID":"72966ca5-c358-48b0-89b0-72c2f65d3541","Type":"ContainerDied","Data":"92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac"} Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.851714 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-b9pkv" event={"ID":"72966ca5-c358-48b0-89b0-72c2f65d3541","Type":"ContainerDied","Data":"97dbd1c6f9328bf48e52ea8d35c9fdd4a65f8189d879619461cd6a168f5a2fb7"} Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.851745 4831 scope.go:117] "RemoveContainer" containerID="92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.895410 4831 scope.go:117] "RemoveContainer" containerID="7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.929723 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9pkv"] Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.940594 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-b9pkv"] Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.947024 4831 scope.go:117] "RemoveContainer" containerID="dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.992202 4831 scope.go:117] "RemoveContainer" containerID="92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac" Dec 03 08:09:16 crc kubenswrapper[4831]: E1203 08:09:16.992744 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac\": container with ID starting with 92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac not found: ID does not exist" containerID="92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.992791 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac"} err="failed to get container status \"92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac\": rpc error: code = NotFound desc = could not find container \"92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac\": container with ID starting with 92bcfa7ea2637f475c5841e06dc54a27d01be9b5b6b4feaa4459f822686a60ac not found: ID does not exist" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.992819 4831 scope.go:117] "RemoveContainer" containerID="7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2" Dec 03 08:09:16 crc kubenswrapper[4831]: E1203 08:09:16.993089 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2\": container with ID starting with 7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2 not found: ID does not exist" containerID="7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.993166 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2"} err="failed to get container status \"7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2\": rpc error: code = NotFound desc = could not find container \"7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2\": container with ID 
starting with 7623728496c0a772f9adadd7adaac8815ff78cee3eb214742ab9450d124870a2 not found: ID does not exist" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.993187 4831 scope.go:117] "RemoveContainer" containerID="dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6" Dec 03 08:09:16 crc kubenswrapper[4831]: E1203 08:09:16.993587 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6\": container with ID starting with dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6 not found: ID does not exist" containerID="dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6" Dec 03 08:09:16 crc kubenswrapper[4831]: I1203 08:09:16.993650 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6"} err="failed to get container status \"dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6\": rpc error: code = NotFound desc = could not find container \"dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6\": container with ID starting with dbf195ff59e97fc7b51368189d9aa5b6194e00d816202d8635fec54c80ac55d6 not found: ID does not exist" Dec 03 08:09:17 crc kubenswrapper[4831]: I1203 08:09:17.024079 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" path="/var/lib/kubelet/pods/72966ca5-c358-48b0-89b0-72c2f65d3541/volumes" Dec 03 08:09:19 crc kubenswrapper[4831]: I1203 08:09:19.043182 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-2lq75"] Dec 03 08:09:19 crc kubenswrapper[4831]: I1203 08:09:19.043674 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-2lq75"] Dec 03 08:09:20 crc kubenswrapper[4831]: I1203 08:09:20.238381 4831 scope.go:117] 
"RemoveContainer" containerID="b89ce73699b3850b1974a1ba8c4558871b6a7ee61752e50e902b696603da1a08" Dec 03 08:09:20 crc kubenswrapper[4831]: I1203 08:09:20.279808 4831 scope.go:117] "RemoveContainer" containerID="3feb284151e5983b4315f80a434ce03e91c574931dc5027b83e41550404414f1" Dec 03 08:09:21 crc kubenswrapper[4831]: I1203 08:09:21.013474 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:09:21 crc kubenswrapper[4831]: E1203 08:09:21.013737 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:09:21 crc kubenswrapper[4831]: I1203 08:09:21.024882 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05cc059b-0638-4e4c-8410-ace0ba4f391a" path="/var/lib/kubelet/pods/05cc059b-0638-4e4c-8410-ace0ba4f391a/volumes" Dec 03 08:09:22 crc kubenswrapper[4831]: I1203 08:09:22.989370 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dcd7t"] Dec 03 08:09:22 crc kubenswrapper[4831]: E1203 08:09:22.990180 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerName="extract-content" Dec 03 08:09:22 crc kubenswrapper[4831]: I1203 08:09:22.990200 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerName="extract-content" Dec 03 08:09:22 crc kubenswrapper[4831]: E1203 08:09:22.990220 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerName="extract-utilities" Dec 03 08:09:22 crc kubenswrapper[4831]: 
I1203 08:09:22.990229 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerName="extract-utilities" Dec 03 08:09:22 crc kubenswrapper[4831]: E1203 08:09:22.990255 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerName="registry-server" Dec 03 08:09:22 crc kubenswrapper[4831]: I1203 08:09:22.990267 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerName="registry-server" Dec 03 08:09:22 crc kubenswrapper[4831]: I1203 08:09:22.990632 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="72966ca5-c358-48b0-89b0-72c2f65d3541" containerName="registry-server" Dec 03 08:09:22 crc kubenswrapper[4831]: I1203 08:09:22.991540 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:22 crc kubenswrapper[4831]: I1203 08:09:22.994593 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-rbgbv" Dec 03 08:09:22 crc kubenswrapper[4831]: I1203 08:09:22.994950 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.006398 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-9lvg5"] Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.009261 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.024593 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dcd7t"] Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.040050 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9lvg5"] Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.179647 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-run-ovn\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.179730 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-lib\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.179957 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-run\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.180019 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-run\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.180174 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-log\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.180250 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dm7nb\" (UniqueName: \"kubernetes.io/projected/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-kube-api-access-dm7nb\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.180283 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-etc-ovs\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.180415 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-log-ovn\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.180530 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-scripts\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.180554 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mr7b\" (UniqueName: \"kubernetes.io/projected/6a8df1d7-27d5-417c-b10e-379dad30e5cf-kube-api-access-2mr7b\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.180593 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a8df1d7-27d5-417c-b10e-379dad30e5cf-scripts\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282466 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-run\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282519 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-run\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282552 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-log\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282589 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dm7nb\" (UniqueName: 
\"kubernetes.io/projected/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-kube-api-access-dm7nb\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282606 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-etc-ovs\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282651 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-log-ovn\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282692 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-scripts\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282709 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2mr7b\" (UniqueName: \"kubernetes.io/projected/6a8df1d7-27d5-417c-b10e-379dad30e5cf-kube-api-access-2mr7b\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a8df1d7-27d5-417c-b10e-379dad30e5cf-scripts\") pod \"ovn-controller-ovs-9lvg5\" (UID: 
\"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282754 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-run-ovn\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282772 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-lib\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282843 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-log-ovn\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282916 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-lib\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282926 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-etc-ovs\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282983 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-run\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.282985 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-var-run-ovn\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.283051 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-log\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.285697 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/6a8df1d7-27d5-417c-b10e-379dad30e5cf-scripts\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.285702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-scripts\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.285785 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/6a8df1d7-27d5-417c-b10e-379dad30e5cf-var-run\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") 
" pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.310677 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mr7b\" (UniqueName: \"kubernetes.io/projected/6a8df1d7-27d5-417c-b10e-379dad30e5cf-kube-api-access-2mr7b\") pod \"ovn-controller-ovs-9lvg5\" (UID: \"6a8df1d7-27d5-417c-b10e-379dad30e5cf\") " pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.318351 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dm7nb\" (UniqueName: \"kubernetes.io/projected/0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8-kube-api-access-dm7nb\") pod \"ovn-controller-dcd7t\" (UID: \"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8\") " pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.364939 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:23 crc kubenswrapper[4831]: I1203 08:09:23.613806 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.115435 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dcd7t"] Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.194556 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-9lvg5"] Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.560051 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-vmqws"] Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.562106 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.564945 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.595139 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vmqws"] Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.619669 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5d61e42-ab2f-40b3-8b9f-bb480e485790-ovs-rundir\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.619788 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jd5c7\" (UniqueName: \"kubernetes.io/projected/a5d61e42-ab2f-40b3-8b9f-bb480e485790-kube-api-access-jd5c7\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.619834 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d61e42-ab2f-40b3-8b9f-bb480e485790-config\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.619900 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5d61e42-ab2f-40b3-8b9f-bb480e485790-ovn-rundir\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " 
pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.721805 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5d61e42-ab2f-40b3-8b9f-bb480e485790-ovs-rundir\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.722109 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jd5c7\" (UniqueName: \"kubernetes.io/projected/a5d61e42-ab2f-40b3-8b9f-bb480e485790-kube-api-access-jd5c7\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.722205 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d61e42-ab2f-40b3-8b9f-bb480e485790-config\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.722396 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5d61e42-ab2f-40b3-8b9f-bb480e485790-ovn-rundir\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.722693 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5d61e42-ab2f-40b3-8b9f-bb480e485790-ovn-rundir\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: 
I1203 08:09:24.722815 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5d61e42-ab2f-40b3-8b9f-bb480e485790-ovs-rundir\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.723096 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5d61e42-ab2f-40b3-8b9f-bb480e485790-config\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.759974 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jd5c7\" (UniqueName: \"kubernetes.io/projected/a5d61e42-ab2f-40b3-8b9f-bb480e485790-kube-api-access-jd5c7\") pod \"ovn-controller-metrics-vmqws\" (UID: \"a5d61e42-ab2f-40b3-8b9f-bb480e485790\") " pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.891538 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-vmqws" Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.925015 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9lvg5" event={"ID":"6a8df1d7-27d5-417c-b10e-379dad30e5cf","Type":"ContainerStarted","Data":"fb83f402d894c9755726780b91297009f0c68b0f616528d857234219a842d5ab"} Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.925057 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9lvg5" event={"ID":"6a8df1d7-27d5-417c-b10e-379dad30e5cf","Type":"ContainerStarted","Data":"a918e2b854df63a08eb0554d1c8d4bc9c41d96604f4dbbd6f59188d7d12b18ff"} Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.937339 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dcd7t" event={"ID":"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8","Type":"ContainerStarted","Data":"e4ff3c66b7a33202f54243939177c11c9e445ddf3416ac97e95bbe9961129798"} Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.937393 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dcd7t" event={"ID":"0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8","Type":"ContainerStarted","Data":"e7da525808c911a304d9594996fa6b905e0beb0c553fd7bc8f024fda0cd30253"} Dec 03 08:09:24 crc kubenswrapper[4831]: I1203 08:09:24.939494 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-dcd7t" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.227156 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dcd7t" podStartSLOduration=3.227137438 podStartE2EDuration="3.227137438s" podCreationTimestamp="2025-12-03 08:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:09:24.996071301 +0000 UTC m=+5902.339654819" watchObservedRunningTime="2025-12-03 
08:09:25.227137438 +0000 UTC m=+5902.570720946" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.229377 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-create-vtngw"] Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.230690 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.238776 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-vtngw"] Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.352935 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tdmm\" (UniqueName: \"kubernetes.io/projected/2c6118f2-abda-41fc-9160-cf14d0742581-kube-api-access-4tdmm\") pod \"octavia-db-create-vtngw\" (UID: \"2c6118f2-abda-41fc-9160-cf14d0742581\") " pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.353268 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6118f2-abda-41fc-9160-cf14d0742581-operator-scripts\") pod \"octavia-db-create-vtngw\" (UID: \"2c6118f2-abda-41fc-9160-cf14d0742581\") " pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.405848 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-vmqws"] Dec 03 08:09:25 crc kubenswrapper[4831]: W1203 08:09:25.407030 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5d61e42_ab2f_40b3_8b9f_bb480e485790.slice/crio-b6f2182af218e160bdc54b5005b1c56e283d44457cc6bf9d9a34dc8470b617f9 WatchSource:0}: Error finding container b6f2182af218e160bdc54b5005b1c56e283d44457cc6bf9d9a34dc8470b617f9: Status 404 returned error can't find the container with id 
b6f2182af218e160bdc54b5005b1c56e283d44457cc6bf9d9a34dc8470b617f9 Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.454647 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6118f2-abda-41fc-9160-cf14d0742581-operator-scripts\") pod \"octavia-db-create-vtngw\" (UID: \"2c6118f2-abda-41fc-9160-cf14d0742581\") " pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.454768 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tdmm\" (UniqueName: \"kubernetes.io/projected/2c6118f2-abda-41fc-9160-cf14d0742581-kube-api-access-4tdmm\") pod \"octavia-db-create-vtngw\" (UID: \"2c6118f2-abda-41fc-9160-cf14d0742581\") " pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.455705 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6118f2-abda-41fc-9160-cf14d0742581-operator-scripts\") pod \"octavia-db-create-vtngw\" (UID: \"2c6118f2-abda-41fc-9160-cf14d0742581\") " pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.477572 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tdmm\" (UniqueName: \"kubernetes.io/projected/2c6118f2-abda-41fc-9160-cf14d0742581-kube-api-access-4tdmm\") pod \"octavia-db-create-vtngw\" (UID: \"2c6118f2-abda-41fc-9160-cf14d0742581\") " pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.549732 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.948691 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vmqws" event={"ID":"a5d61e42-ab2f-40b3-8b9f-bb480e485790","Type":"ContainerStarted","Data":"a392925fff0cfecde4194edd2f88c9a7d7b6d714953b862c15c80efff61686f3"} Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.949074 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-vmqws" event={"ID":"a5d61e42-ab2f-40b3-8b9f-bb480e485790","Type":"ContainerStarted","Data":"b6f2182af218e160bdc54b5005b1c56e283d44457cc6bf9d9a34dc8470b617f9"} Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.950169 4831 generic.go:334] "Generic (PLEG): container finished" podID="6a8df1d7-27d5-417c-b10e-379dad30e5cf" containerID="fb83f402d894c9755726780b91297009f0c68b0f616528d857234219a842d5ab" exitCode=0 Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.950260 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9lvg5" event={"ID":"6a8df1d7-27d5-417c-b10e-379dad30e5cf","Type":"ContainerDied","Data":"fb83f402d894c9755726780b91297009f0c68b0f616528d857234219a842d5ab"} Dec 03 08:09:25 crc kubenswrapper[4831]: I1203 08:09:25.971471 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-vmqws" podStartSLOduration=1.971455291 podStartE2EDuration="1.971455291s" podCreationTimestamp="2025-12-03 08:09:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:09:25.963590896 +0000 UTC m=+5903.307174404" watchObservedRunningTime="2025-12-03 08:09:25.971455291 +0000 UTC m=+5903.315038789" Dec 03 08:09:26 crc kubenswrapper[4831]: I1203 08:09:26.013667 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-create-vtngw"] Dec 03 
08:09:26 crc kubenswrapper[4831]: I1203 08:09:26.960001 4831 generic.go:334] "Generic (PLEG): container finished" podID="2c6118f2-abda-41fc-9160-cf14d0742581" containerID="a43e7f1eadecaf8f2edb5a97b55f867a2382ed52337791bb6e47c00b2f9a4094" exitCode=0 Dec 03 08:09:26 crc kubenswrapper[4831]: I1203 08:09:26.960546 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-vtngw" event={"ID":"2c6118f2-abda-41fc-9160-cf14d0742581","Type":"ContainerDied","Data":"a43e7f1eadecaf8f2edb5a97b55f867a2382ed52337791bb6e47c00b2f9a4094"} Dec 03 08:09:26 crc kubenswrapper[4831]: I1203 08:09:26.960574 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-vtngw" event={"ID":"2c6118f2-abda-41fc-9160-cf14d0742581","Type":"ContainerStarted","Data":"85a694c65bbd3de0a042ef08106132dae4f325ce961e918f860682d0aa6c6e4d"} Dec 03 08:09:26 crc kubenswrapper[4831]: I1203 08:09:26.964245 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9lvg5" event={"ID":"6a8df1d7-27d5-417c-b10e-379dad30e5cf","Type":"ContainerStarted","Data":"955037ca996ab9a34b84d7343f8f6ae1a47dd320b9612c21f5c82abc3394bca4"} Dec 03 08:09:26 crc kubenswrapper[4831]: I1203 08:09:26.964274 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-9lvg5" event={"ID":"6a8df1d7-27d5-417c-b10e-379dad30e5cf","Type":"ContainerStarted","Data":"5f5cb0e20741307c28c6ea4c90e8a9bb19a792c0a882ae6faed307a5104758e6"} Dec 03 08:09:26 crc kubenswrapper[4831]: I1203 08:09:26.997779 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-9lvg5" podStartSLOduration=4.997759307 podStartE2EDuration="4.997759307s" podCreationTimestamp="2025-12-03 08:09:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:09:26.993072581 +0000 UTC m=+5904.336656109" 
watchObservedRunningTime="2025-12-03 08:09:26.997759307 +0000 UTC m=+5904.341342815" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.512184 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-9d49-account-create-update-vtgjt"] Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.513539 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.516629 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-db-secret" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.524237 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9d49-account-create-update-vtgjt"] Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.606054 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693d9075-b11e-4420-b6d2-53f50b6cbeaf-operator-scripts\") pod \"octavia-9d49-account-create-update-vtgjt\" (UID: \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\") " pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.606525 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kbzhw\" (UniqueName: \"kubernetes.io/projected/693d9075-b11e-4420-b6d2-53f50b6cbeaf-kube-api-access-kbzhw\") pod \"octavia-9d49-account-create-update-vtgjt\" (UID: \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\") " pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.709068 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693d9075-b11e-4420-b6d2-53f50b6cbeaf-operator-scripts\") pod \"octavia-9d49-account-create-update-vtgjt\" (UID: 
\"693d9075-b11e-4420-b6d2-53f50b6cbeaf\") " pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.709233 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kbzhw\" (UniqueName: \"kubernetes.io/projected/693d9075-b11e-4420-b6d2-53f50b6cbeaf-kube-api-access-kbzhw\") pod \"octavia-9d49-account-create-update-vtgjt\" (UID: \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\") " pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.710553 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693d9075-b11e-4420-b6d2-53f50b6cbeaf-operator-scripts\") pod \"octavia-9d49-account-create-update-vtgjt\" (UID: \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\") " pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.737702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kbzhw\" (UniqueName: \"kubernetes.io/projected/693d9075-b11e-4420-b6d2-53f50b6cbeaf-kube-api-access-kbzhw\") pod \"octavia-9d49-account-create-update-vtgjt\" (UID: \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\") " pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.846991 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.987630 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:27 crc kubenswrapper[4831]: I1203 08:09:27.987952 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:28 crc kubenswrapper[4831]: W1203 08:09:28.346611 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod693d9075_b11e_4420_b6d2_53f50b6cbeaf.slice/crio-4a022ae3b43f859214d138b4efca9b97ebbfa9e648c6c58efbd3a91893e6debc WatchSource:0}: Error finding container 4a022ae3b43f859214d138b4efca9b97ebbfa9e648c6c58efbd3a91893e6debc: Status 404 returned error can't find the container with id 4a022ae3b43f859214d138b4efca9b97ebbfa9e648c6c58efbd3a91893e6debc Dec 03 08:09:28 crc kubenswrapper[4831]: I1203 08:09:28.350473 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-9d49-account-create-update-vtgjt"] Dec 03 08:09:28 crc kubenswrapper[4831]: I1203 08:09:28.527993 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:28 crc kubenswrapper[4831]: I1203 08:09:28.725404 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6118f2-abda-41fc-9160-cf14d0742581-operator-scripts\") pod \"2c6118f2-abda-41fc-9160-cf14d0742581\" (UID: \"2c6118f2-abda-41fc-9160-cf14d0742581\") " Dec 03 08:09:28 crc kubenswrapper[4831]: I1203 08:09:28.726175 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c6118f2-abda-41fc-9160-cf14d0742581-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2c6118f2-abda-41fc-9160-cf14d0742581" (UID: "2c6118f2-abda-41fc-9160-cf14d0742581"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:09:28 crc kubenswrapper[4831]: I1203 08:09:28.726430 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tdmm\" (UniqueName: \"kubernetes.io/projected/2c6118f2-abda-41fc-9160-cf14d0742581-kube-api-access-4tdmm\") pod \"2c6118f2-abda-41fc-9160-cf14d0742581\" (UID: \"2c6118f2-abda-41fc-9160-cf14d0742581\") " Dec 03 08:09:28 crc kubenswrapper[4831]: I1203 08:09:28.727544 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2c6118f2-abda-41fc-9160-cf14d0742581-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:28 crc kubenswrapper[4831]: I1203 08:09:28.731248 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c6118f2-abda-41fc-9160-cf14d0742581-kube-api-access-4tdmm" (OuterVolumeSpecName: "kube-api-access-4tdmm") pod "2c6118f2-abda-41fc-9160-cf14d0742581" (UID: "2c6118f2-abda-41fc-9160-cf14d0742581"). InnerVolumeSpecName "kube-api-access-4tdmm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:09:28 crc kubenswrapper[4831]: I1203 08:09:28.829140 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tdmm\" (UniqueName: \"kubernetes.io/projected/2c6118f2-abda-41fc-9160-cf14d0742581-kube-api-access-4tdmm\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:29 crc kubenswrapper[4831]: I1203 08:09:29.000046 4831 generic.go:334] "Generic (PLEG): container finished" podID="693d9075-b11e-4420-b6d2-53f50b6cbeaf" containerID="2f0f9521b47b441378ea565ea86104d2d347b013bcb1156239e119eb5d036712" exitCode=0 Dec 03 08:09:29 crc kubenswrapper[4831]: I1203 08:09:29.000123 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9d49-account-create-update-vtgjt" event={"ID":"693d9075-b11e-4420-b6d2-53f50b6cbeaf","Type":"ContainerDied","Data":"2f0f9521b47b441378ea565ea86104d2d347b013bcb1156239e119eb5d036712"} Dec 03 08:09:29 crc kubenswrapper[4831]: I1203 08:09:29.000196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9d49-account-create-update-vtgjt" event={"ID":"693d9075-b11e-4420-b6d2-53f50b6cbeaf","Type":"ContainerStarted","Data":"4a022ae3b43f859214d138b4efca9b97ebbfa9e648c6c58efbd3a91893e6debc"} Dec 03 08:09:29 crc kubenswrapper[4831]: I1203 08:09:29.003427 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-create-vtngw" Dec 03 08:09:29 crc kubenswrapper[4831]: I1203 08:09:29.003427 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-create-vtngw" event={"ID":"2c6118f2-abda-41fc-9160-cf14d0742581","Type":"ContainerDied","Data":"85a694c65bbd3de0a042ef08106132dae4f325ce961e918f860682d0aa6c6e4d"} Dec 03 08:09:29 crc kubenswrapper[4831]: I1203 08:09:29.003479 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="85a694c65bbd3de0a042ef08106132dae4f325ce961e918f860682d0aa6c6e4d" Dec 03 08:09:30 crc kubenswrapper[4831]: I1203 08:09:30.457602 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:30 crc kubenswrapper[4831]: I1203 08:09:30.560968 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693d9075-b11e-4420-b6d2-53f50b6cbeaf-operator-scripts\") pod \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\" (UID: \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\") " Dec 03 08:09:30 crc kubenswrapper[4831]: I1203 08:09:30.561564 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/693d9075-b11e-4420-b6d2-53f50b6cbeaf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "693d9075-b11e-4420-b6d2-53f50b6cbeaf" (UID: "693d9075-b11e-4420-b6d2-53f50b6cbeaf"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:09:30 crc kubenswrapper[4831]: I1203 08:09:30.561572 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kbzhw\" (UniqueName: \"kubernetes.io/projected/693d9075-b11e-4420-b6d2-53f50b6cbeaf-kube-api-access-kbzhw\") pod \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\" (UID: \"693d9075-b11e-4420-b6d2-53f50b6cbeaf\") " Dec 03 08:09:30 crc kubenswrapper[4831]: I1203 08:09:30.562121 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/693d9075-b11e-4420-b6d2-53f50b6cbeaf-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:30 crc kubenswrapper[4831]: I1203 08:09:30.567681 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/693d9075-b11e-4420-b6d2-53f50b6cbeaf-kube-api-access-kbzhw" (OuterVolumeSpecName: "kube-api-access-kbzhw") pod "693d9075-b11e-4420-b6d2-53f50b6cbeaf" (UID: "693d9075-b11e-4420-b6d2-53f50b6cbeaf"). InnerVolumeSpecName "kube-api-access-kbzhw". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:09:30 crc kubenswrapper[4831]: I1203 08:09:30.664575 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kbzhw\" (UniqueName: \"kubernetes.io/projected/693d9075-b11e-4420-b6d2-53f50b6cbeaf-kube-api-access-kbzhw\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:31 crc kubenswrapper[4831]: I1203 08:09:31.021198 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-9d49-account-create-update-vtgjt" Dec 03 08:09:31 crc kubenswrapper[4831]: I1203 08:09:31.022857 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-9d49-account-create-update-vtgjt" event={"ID":"693d9075-b11e-4420-b6d2-53f50b6cbeaf","Type":"ContainerDied","Data":"4a022ae3b43f859214d138b4efca9b97ebbfa9e648c6c58efbd3a91893e6debc"} Dec 03 08:09:31 crc kubenswrapper[4831]: I1203 08:09:31.022893 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4a022ae3b43f859214d138b4efca9b97ebbfa9e648c6c58efbd3a91893e6debc" Dec 03 08:09:32 crc kubenswrapper[4831]: I1203 08:09:32.036391 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-4cvbb"] Dec 03 08:09:32 crc kubenswrapper[4831]: I1203 08:09:32.045032 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-4cvbb"] Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.025849 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e" path="/var/lib/kubelet/pods/5acc32bd-3dfa-482f-aac8-2a2d9e7d3b3e/volumes" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.499145 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-persistence-db-create-6xrql"] Dec 03 08:09:33 crc kubenswrapper[4831]: E1203 08:09:33.499507 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="693d9075-b11e-4420-b6d2-53f50b6cbeaf" containerName="mariadb-account-create-update" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.499520 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="693d9075-b11e-4420-b6d2-53f50b6cbeaf" containerName="mariadb-account-create-update" Dec 03 08:09:33 crc kubenswrapper[4831]: E1203 08:09:33.499564 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c6118f2-abda-41fc-9160-cf14d0742581" containerName="mariadb-database-create" 
Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.499571 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c6118f2-abda-41fc-9160-cf14d0742581" containerName="mariadb-database-create" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.499737 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c6118f2-abda-41fc-9160-cf14d0742581" containerName="mariadb-database-create" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.499753 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="693d9075-b11e-4420-b6d2-53f50b6cbeaf" containerName="mariadb-account-create-update" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.500371 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.531770 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-6xrql"] Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.620029 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5c8c1d0-6e54-4b99-9f75-f4620e738213-operator-scripts\") pod \"octavia-persistence-db-create-6xrql\" (UID: \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\") " pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.620135 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lpqzp\" (UniqueName: \"kubernetes.io/projected/f5c8c1d0-6e54-4b99-9f75-f4620e738213-kube-api-access-lpqzp\") pod \"octavia-persistence-db-create-6xrql\" (UID: \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\") " pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.722145 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lpqzp\" (UniqueName: \"kubernetes.io/projected/f5c8c1d0-6e54-4b99-9f75-f4620e738213-kube-api-access-lpqzp\") pod \"octavia-persistence-db-create-6xrql\" (UID: \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\") " pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.722291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5c8c1d0-6e54-4b99-9f75-f4620e738213-operator-scripts\") pod \"octavia-persistence-db-create-6xrql\" (UID: \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\") " pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.722983 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5c8c1d0-6e54-4b99-9f75-f4620e738213-operator-scripts\") pod \"octavia-persistence-db-create-6xrql\" (UID: \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\") " pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.741843 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lpqzp\" (UniqueName: \"kubernetes.io/projected/f5c8c1d0-6e54-4b99-9f75-f4620e738213-kube-api-access-lpqzp\") pod \"octavia-persistence-db-create-6xrql\" (UID: \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\") " pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:33 crc kubenswrapper[4831]: I1203 08:09:33.823983 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.013040 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:09:34 crc kubenswrapper[4831]: E1203 08:09:34.013705 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.402994 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-persistence-db-create-6xrql"] Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.618838 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-b4f4-account-create-update-zldmg"] Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.620534 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.622903 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-persistence-db-secret" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.629845 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-b4f4-account-create-update-zldmg"] Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.744892 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llw6v\" (UniqueName: \"kubernetes.io/projected/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-kube-api-access-llw6v\") pod \"octavia-b4f4-account-create-update-zldmg\" (UID: \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\") " pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.745090 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-operator-scripts\") pod \"octavia-b4f4-account-create-update-zldmg\" (UID: \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\") " pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.846775 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llw6v\" (UniqueName: \"kubernetes.io/projected/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-kube-api-access-llw6v\") pod \"octavia-b4f4-account-create-update-zldmg\" (UID: \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\") " pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.846957 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-operator-scripts\") pod \"octavia-b4f4-account-create-update-zldmg\" (UID: \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\") " pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.847900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-operator-scripts\") pod \"octavia-b4f4-account-create-update-zldmg\" (UID: \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\") " pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.870568 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llw6v\" (UniqueName: \"kubernetes.io/projected/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-kube-api-access-llw6v\") pod \"octavia-b4f4-account-create-update-zldmg\" (UID: \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\") " pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:34 crc kubenswrapper[4831]: I1203 08:09:34.937689 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:35 crc kubenswrapper[4831]: I1203 08:09:35.081011 4831 generic.go:334] "Generic (PLEG): container finished" podID="f5c8c1d0-6e54-4b99-9f75-f4620e738213" containerID="19ba57eb0cab60fd76b8207156a52ab068b6b3cd8cb4107a9d3a3285e3c219cc" exitCode=0 Dec 03 08:09:35 crc kubenswrapper[4831]: I1203 08:09:35.081052 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-6xrql" event={"ID":"f5c8c1d0-6e54-4b99-9f75-f4620e738213","Type":"ContainerDied","Data":"19ba57eb0cab60fd76b8207156a52ab068b6b3cd8cb4107a9d3a3285e3c219cc"} Dec 03 08:09:35 crc kubenswrapper[4831]: I1203 08:09:35.081081 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-6xrql" event={"ID":"f5c8c1d0-6e54-4b99-9f75-f4620e738213","Type":"ContainerStarted","Data":"967b171755c407b50e84103f864fbe043d8130015db0754fd0f347f93bf3b613"} Dec 03 08:09:35 crc kubenswrapper[4831]: W1203 08:09:35.441713 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9cdbd5f3_bcc4_4a5c_87be_e2fa05c6b2a3.slice/crio-96850d680df44d8e5596d5bc200a9a8c27ed50b4fe7b81b4f9272a26cc348af2 WatchSource:0}: Error finding container 96850d680df44d8e5596d5bc200a9a8c27ed50b4fe7b81b4f9272a26cc348af2: Status 404 returned error can't find the container with id 96850d680df44d8e5596d5bc200a9a8c27ed50b4fe7b81b4f9272a26cc348af2 Dec 03 08:09:35 crc kubenswrapper[4831]: I1203 08:09:35.448837 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-b4f4-account-create-update-zldmg"] Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.097011 4831 generic.go:334] "Generic (PLEG): container finished" podID="9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3" containerID="dd23da52a8cb037b3156127215b643e85357e4d660c950e512c3a1a1c6c75628" exitCode=0 Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 
08:09:36.097105 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b4f4-account-create-update-zldmg" event={"ID":"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3","Type":"ContainerDied","Data":"dd23da52a8cb037b3156127215b643e85357e4d660c950e512c3a1a1c6c75628"} Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.097171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b4f4-account-create-update-zldmg" event={"ID":"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3","Type":"ContainerStarted","Data":"96850d680df44d8e5596d5bc200a9a8c27ed50b4fe7b81b4f9272a26cc348af2"} Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.522597 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.684628 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lpqzp\" (UniqueName: \"kubernetes.io/projected/f5c8c1d0-6e54-4b99-9f75-f4620e738213-kube-api-access-lpqzp\") pod \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\" (UID: \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\") " Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.684803 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5c8c1d0-6e54-4b99-9f75-f4620e738213-operator-scripts\") pod \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\" (UID: \"f5c8c1d0-6e54-4b99-9f75-f4620e738213\") " Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.686155 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f5c8c1d0-6e54-4b99-9f75-f4620e738213-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f5c8c1d0-6e54-4b99-9f75-f4620e738213" (UID: "f5c8c1d0-6e54-4b99-9f75-f4620e738213"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.701712 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5c8c1d0-6e54-4b99-9f75-f4620e738213-kube-api-access-lpqzp" (OuterVolumeSpecName: "kube-api-access-lpqzp") pod "f5c8c1d0-6e54-4b99-9f75-f4620e738213" (UID: "f5c8c1d0-6e54-4b99-9f75-f4620e738213"). InnerVolumeSpecName "kube-api-access-lpqzp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.787173 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lpqzp\" (UniqueName: \"kubernetes.io/projected/f5c8c1d0-6e54-4b99-9f75-f4620e738213-kube-api-access-lpqzp\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:36 crc kubenswrapper[4831]: I1203 08:09:36.787435 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f5c8c1d0-6e54-4b99-9f75-f4620e738213-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.111394 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-persistence-db-create-6xrql" event={"ID":"f5c8c1d0-6e54-4b99-9f75-f4620e738213","Type":"ContainerDied","Data":"967b171755c407b50e84103f864fbe043d8130015db0754fd0f347f93bf3b613"} Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.111761 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="967b171755c407b50e84103f864fbe043d8130015db0754fd0f347f93bf3b613" Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.111461 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-persistence-db-create-6xrql" Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.588524 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.705953 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-operator-scripts\") pod \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\" (UID: \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\") " Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.706154 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llw6v\" (UniqueName: \"kubernetes.io/projected/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-kube-api-access-llw6v\") pod \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\" (UID: \"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3\") " Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.706742 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3" (UID: "9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.711139 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-kube-api-access-llw6v" (OuterVolumeSpecName: "kube-api-access-llw6v") pod "9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3" (UID: "9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3"). InnerVolumeSpecName "kube-api-access-llw6v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.808631 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:37 crc kubenswrapper[4831]: I1203 08:09:37.809024 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llw6v\" (UniqueName: \"kubernetes.io/projected/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3-kube-api-access-llw6v\") on node \"crc\" DevicePath \"\"" Dec 03 08:09:38 crc kubenswrapper[4831]: I1203 08:09:38.123345 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-b4f4-account-create-update-zldmg" event={"ID":"9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3","Type":"ContainerDied","Data":"96850d680df44d8e5596d5bc200a9a8c27ed50b4fe7b81b4f9272a26cc348af2"} Dec 03 08:09:38 crc kubenswrapper[4831]: I1203 08:09:38.123899 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="96850d680df44d8e5596d5bc200a9a8c27ed50b4fe7b81b4f9272a26cc348af2" Dec 03 08:09:38 crc kubenswrapper[4831]: I1203 08:09:38.123469 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-b4f4-account-create-update-zldmg" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.616924 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-api-6655c79d96-8cnxb"] Dec 03 08:09:40 crc kubenswrapper[4831]: E1203 08:09:40.619426 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3" containerName="mariadb-account-create-update" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.619446 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3" containerName="mariadb-account-create-update" Dec 03 08:09:40 crc kubenswrapper[4831]: E1203 08:09:40.619482 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5c8c1d0-6e54-4b99-9f75-f4620e738213" containerName="mariadb-database-create" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.619492 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5c8c1d0-6e54-4b99-9f75-f4620e738213" containerName="mariadb-database-create" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.619747 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5c8c1d0-6e54-4b99-9f75-f4620e738213" containerName="mariadb-database-create" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.619768 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3" containerName="mariadb-account-create-update" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.621802 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.625812 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-scripts" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.626256 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-octavia-dockercfg-q2dzx" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.626528 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-api-config-data" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.633215 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6655c79d96-8cnxb"] Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.816969 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-config-data\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.817588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ed6653bb-4e85-401b-a22c-f834ceea376b-config-data-merged\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.817736 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ed6653bb-4e85-401b-a22c-f834ceea376b-octavia-run\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: 
I1203 08:09:40.817882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-scripts\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.818058 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-combined-ca-bundle\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.919651 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-config-data\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.919706 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ed6653bb-4e85-401b-a22c-f834ceea376b-config-data-merged\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.919735 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ed6653bb-4e85-401b-a22c-f834ceea376b-octavia-run\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.919768 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-scripts\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.919804 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-combined-ca-bundle\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.920638 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"octavia-run\" (UniqueName: \"kubernetes.io/empty-dir/ed6653bb-4e85-401b-a22c-f834ceea376b-octavia-run\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.920694 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/ed6653bb-4e85-401b-a22c-f834ceea376b-config-data-merged\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.926186 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-config-data\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.926684 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-scripts\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.930484 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed6653bb-4e85-401b-a22c-f834ceea376b-combined-ca-bundle\") pod \"octavia-api-6655c79d96-8cnxb\" (UID: \"ed6653bb-4e85-401b-a22c-f834ceea376b\") " pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:40 crc kubenswrapper[4831]: I1203 08:09:40.939222 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:41 crc kubenswrapper[4831]: I1203 08:09:41.434788 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-api-6655c79d96-8cnxb"] Dec 03 08:09:41 crc kubenswrapper[4831]: W1203 08:09:41.438128 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded6653bb_4e85_401b_a22c_f834ceea376b.slice/crio-be8da86f2ec1c540b3ed8bc5630f5aeb0075c57b3550f8297d77ad3463bbf5ea WatchSource:0}: Error finding container be8da86f2ec1c540b3ed8bc5630f5aeb0075c57b3550f8297d77ad3463bbf5ea: Status 404 returned error can't find the container with id be8da86f2ec1c540b3ed8bc5630f5aeb0075c57b3550f8297d77ad3463bbf5ea Dec 03 08:09:42 crc kubenswrapper[4831]: I1203 08:09:42.242707 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6655c79d96-8cnxb" event={"ID":"ed6653bb-4e85-401b-a22c-f834ceea376b","Type":"ContainerStarted","Data":"be8da86f2ec1c540b3ed8bc5630f5aeb0075c57b3550f8297d77ad3463bbf5ea"} Dec 03 08:09:46 crc kubenswrapper[4831]: I1203 08:09:46.013379 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:09:46 crc 
kubenswrapper[4831]: E1203 08:09:46.013958 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:09:52 crc kubenswrapper[4831]: I1203 08:09:52.375788 4831 generic.go:334] "Generic (PLEG): container finished" podID="ed6653bb-4e85-401b-a22c-f834ceea376b" containerID="bc59d62ce8053950954f91aa8955098d52e72d1f999ede7d733abd98a5c3a6f7" exitCode=0 Dec 03 08:09:52 crc kubenswrapper[4831]: I1203 08:09:52.376306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6655c79d96-8cnxb" event={"ID":"ed6653bb-4e85-401b-a22c-f834ceea376b","Type":"ContainerDied","Data":"bc59d62ce8053950954f91aa8955098d52e72d1f999ede7d733abd98a5c3a6f7"} Dec 03 08:09:53 crc kubenswrapper[4831]: I1203 08:09:53.405601 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6655c79d96-8cnxb" event={"ID":"ed6653bb-4e85-401b-a22c-f834ceea376b","Type":"ContainerStarted","Data":"5373e3c0fdaf148ec93c8891828ad9d5cc96e1f208a2f05cf67a5880e329eb61"} Dec 03 08:09:53 crc kubenswrapper[4831]: I1203 08:09:53.405984 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-api-6655c79d96-8cnxb" event={"ID":"ed6653bb-4e85-401b-a22c-f834ceea376b","Type":"ContainerStarted","Data":"4827acee73c53bbf07b2fb1de718aebe583d146ceec5320e9e25e4d238c00e60"} Dec 03 08:09:53 crc kubenswrapper[4831]: I1203 08:09:53.408042 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:53 crc kubenswrapper[4831]: I1203 08:09:53.408165 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:09:53 crc kubenswrapper[4831]: I1203 08:09:53.453923 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-api-6655c79d96-8cnxb" podStartSLOduration=3.579970263 podStartE2EDuration="13.453899886s" podCreationTimestamp="2025-12-03 08:09:40 +0000 UTC" firstStartedPulling="2025-12-03 08:09:41.440444064 +0000 UTC m=+5918.784027572" lastFinishedPulling="2025-12-03 08:09:51.314373687 +0000 UTC m=+5928.657957195" observedRunningTime="2025-12-03 08:09:53.438636731 +0000 UTC m=+5930.782220259" watchObservedRunningTime="2025-12-03 08:09:53.453899886 +0000 UTC m=+5930.797483404" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.120370 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-rsyslog-7vxgf"] Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.123218 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.126457 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-config-data" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.130978 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"octavia-hmport-map" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.131282 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-rsyslog-scripts" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.148638 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-7vxgf"] Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.246835 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-hm-ports\") pod \"octavia-rsyslog-7vxgf\" (UID: 
\"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.247184 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-scripts\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.247310 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-config-data-merged\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.247368 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-config-data\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.348820 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-config-data-merged\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.348885 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-config-data\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 
03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.348931 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-hm-ports\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.348996 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-scripts\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.349888 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-config-data-merged\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.350452 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-hm-ports\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.355012 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-config-data\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.355483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d81b16a8-4c45-4d34-8c90-1e6cd00ead93-scripts\") pod \"octavia-rsyslog-7vxgf\" (UID: \"d81b16a8-4c45-4d34-8c90-1e6cd00ead93\") " pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.491175 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.799347 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8hjch"] Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.801882 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.804812 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-config-data" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.811500 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8hjch"] Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.967704 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5368cba-aa23-4696-92e8-f5c2306d8f57-httpd-config\") pod \"octavia-image-upload-59f8cff499-8hjch\" (UID: \"f5368cba-aa23-4696-92e8-f5c2306d8f57\") " pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:09:55 crc kubenswrapper[4831]: I1203 08:09:55.967745 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/f5368cba-aa23-4696-92e8-f5c2306d8f57-amphora-image\") pod \"octavia-image-upload-59f8cff499-8hjch\" (UID: \"f5368cba-aa23-4696-92e8-f5c2306d8f57\") " pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:09:56 crc kubenswrapper[4831]: I1203 08:09:56.069857 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5368cba-aa23-4696-92e8-f5c2306d8f57-httpd-config\") pod \"octavia-image-upload-59f8cff499-8hjch\" (UID: \"f5368cba-aa23-4696-92e8-f5c2306d8f57\") " pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:09:56 crc kubenswrapper[4831]: I1203 08:09:56.069898 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/f5368cba-aa23-4696-92e8-f5c2306d8f57-amphora-image\") pod \"octavia-image-upload-59f8cff499-8hjch\" (UID: \"f5368cba-aa23-4696-92e8-f5c2306d8f57\") " pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:09:56 crc kubenswrapper[4831]: I1203 08:09:56.070875 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/f5368cba-aa23-4696-92e8-f5c2306d8f57-amphora-image\") pod \"octavia-image-upload-59f8cff499-8hjch\" (UID: \"f5368cba-aa23-4696-92e8-f5c2306d8f57\") " pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:09:56 crc kubenswrapper[4831]: W1203 08:09:56.072198 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd81b16a8_4c45_4d34_8c90_1e6cd00ead93.slice/crio-69ee4ba843a9e16b53a6d438fab52740d4f4d03002597b633f24e3472fcd93a4 WatchSource:0}: Error finding container 69ee4ba843a9e16b53a6d438fab52740d4f4d03002597b633f24e3472fcd93a4: Status 404 returned error can't find the container with id 69ee4ba843a9e16b53a6d438fab52740d4f4d03002597b633f24e3472fcd93a4 Dec 03 08:09:56 crc kubenswrapper[4831]: I1203 08:09:56.083719 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-rsyslog-7vxgf"] Dec 03 08:09:56 crc kubenswrapper[4831]: I1203 08:09:56.087828 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: 
\"kubernetes.io/secret/f5368cba-aa23-4696-92e8-f5c2306d8f57-httpd-config\") pod \"octavia-image-upload-59f8cff499-8hjch\" (UID: \"f5368cba-aa23-4696-92e8-f5c2306d8f57\") " pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:09:56 crc kubenswrapper[4831]: I1203 08:09:56.136828 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:09:56 crc kubenswrapper[4831]: I1203 08:09:56.455444 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-7vxgf" event={"ID":"d81b16a8-4c45-4d34-8c90-1e6cd00ead93","Type":"ContainerStarted","Data":"69ee4ba843a9e16b53a6d438fab52740d4f4d03002597b633f24e3472fcd93a4"} Dec 03 08:09:56 crc kubenswrapper[4831]: I1203 08:09:56.704061 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8hjch"] Dec 03 08:09:57 crc kubenswrapper[4831]: I1203 08:09:57.468305 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8hjch" event={"ID":"f5368cba-aa23-4696-92e8-f5c2306d8f57","Type":"ContainerStarted","Data":"edf185dea521ca18382b402d0ae6dced43cbc3690b62ce48b00e1493957da8e7"} Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.414537 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.422483 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-9lvg5" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.551492 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-dcd7t-config-qkj8c"] Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.553164 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.563851 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dcd7t-config-qkj8c"] Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.591717 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.618668 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run-ovn\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.618712 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-scripts\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.618755 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-log-ovn\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.618778 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rk2f\" (UniqueName: \"kubernetes.io/projected/42c97ca1-7beb-4558-81d7-415fe4c11105-kube-api-access-2rk2f\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: 
\"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.618850 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-additional-scripts\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.618875 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.707975 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-dcd7t" podUID="0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8" containerName="ovn-controller" probeResult="failure" output=< Dec 03 08:09:58 crc kubenswrapper[4831]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Dec 03 08:09:58 crc kubenswrapper[4831]: > Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.720736 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-log-ovn\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.720837 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2rk2f\" (UniqueName: 
\"kubernetes.io/projected/42c97ca1-7beb-4558-81d7-415fe4c11105-kube-api-access-2rk2f\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.720929 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-additional-scripts\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.720965 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.721062 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run-ovn\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.721093 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-scripts\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.723154 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-scripts\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.723873 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.723920 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-additional-scripts\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.723935 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run-ovn\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.723971 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-log-ovn\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.756582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rk2f\" (UniqueName: 
\"kubernetes.io/projected/42c97ca1-7beb-4558-81d7-415fe4c11105-kube-api-access-2rk2f\") pod \"ovn-controller-dcd7t-config-qkj8c\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:58 crc kubenswrapper[4831]: I1203 08:09:58.917639 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:09:59 crc kubenswrapper[4831]: I1203 08:09:59.391844 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-dcd7t-config-qkj8c"] Dec 03 08:09:59 crc kubenswrapper[4831]: I1203 08:09:59.491852 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-7vxgf" event={"ID":"d81b16a8-4c45-4d34-8c90-1e6cd00ead93","Type":"ContainerStarted","Data":"ad58aa9ccc7953ec50173e7c8056c25d0abe1fb49df105f5acd57ffc44ef60e5"} Dec 03 08:09:59 crc kubenswrapper[4831]: I1203 08:09:59.494693 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dcd7t-config-qkj8c" event={"ID":"42c97ca1-7beb-4558-81d7-415fe4c11105","Type":"ContainerStarted","Data":"26238d0a71af10cecf43d2394ab143efbd30d6b11031b63e52d710a7bf1424cc"} Dec 03 08:10:00 crc kubenswrapper[4831]: I1203 08:10:00.012738 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:10:00 crc kubenswrapper[4831]: E1203 08:10:00.013090 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:10:00 crc kubenswrapper[4831]: I1203 08:10:00.507258 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ovn-controller-dcd7t-config-qkj8c" event={"ID":"42c97ca1-7beb-4558-81d7-415fe4c11105","Type":"ContainerStarted","Data":"8d49e18455a26e04c7b8e4d1f7356be7506a77d06a07c4d9aa202bf278a68a87"} Dec 03 08:10:00 crc kubenswrapper[4831]: I1203 08:10:00.530083 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-dcd7t-config-qkj8c" podStartSLOduration=2.530061858 podStartE2EDuration="2.530061858s" podCreationTimestamp="2025-12-03 08:09:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:10:00.522886593 +0000 UTC m=+5937.866470121" watchObservedRunningTime="2025-12-03 08:10:00.530061858 +0000 UTC m=+5937.873645376" Dec 03 08:10:00 crc kubenswrapper[4831]: I1203 08:10:00.983972 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-db-sync-mk5pz"] Dec 03 08:10:00 crc kubenswrapper[4831]: I1203 08:10:00.991369 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:00 crc kubenswrapper[4831]: I1203 08:10:00.995130 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-scripts" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:00.997356 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-mk5pz"] Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.071499 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-combined-ca-bundle\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.071542 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-scripts\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.071570 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.071587 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data-merged\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 
08:10:01.173454 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-combined-ca-bundle\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.173500 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-scripts\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.173527 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.173542 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data-merged\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.174048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data-merged\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.179937 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.179932 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-scripts\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.180512 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-combined-ca-bundle\") pod \"octavia-db-sync-mk5pz\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.331421 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.518676 4831 generic.go:334] "Generic (PLEG): container finished" podID="42c97ca1-7beb-4558-81d7-415fe4c11105" containerID="8d49e18455a26e04c7b8e4d1f7356be7506a77d06a07c4d9aa202bf278a68a87" exitCode=0 Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.519038 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dcd7t-config-qkj8c" event={"ID":"42c97ca1-7beb-4558-81d7-415fe4c11105","Type":"ContainerDied","Data":"8d49e18455a26e04c7b8e4d1f7356be7506a77d06a07c4d9aa202bf278a68a87"} Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.525521 4831 generic.go:334] "Generic (PLEG): container finished" podID="d81b16a8-4c45-4d34-8c90-1e6cd00ead93" containerID="ad58aa9ccc7953ec50173e7c8056c25d0abe1fb49df105f5acd57ffc44ef60e5" exitCode=0 Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.525560 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-7vxgf" event={"ID":"d81b16a8-4c45-4d34-8c90-1e6cd00ead93","Type":"ContainerDied","Data":"ad58aa9ccc7953ec50173e7c8056c25d0abe1fb49df105f5acd57ffc44ef60e5"} Dec 03 08:10:01 crc kubenswrapper[4831]: I1203 08:10:01.823527 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-db-sync-mk5pz"] Dec 03 08:10:01 crc kubenswrapper[4831]: W1203 08:10:01.826079 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7cd6ee4f_27fd_4e4a_aea7_5862219fdf3c.slice/crio-f19bfe2e98bdcc30b4f996d99b2dfbb1cc4906abd20457ebd3e87bd59215b589 WatchSource:0}: Error finding container f19bfe2e98bdcc30b4f996d99b2dfbb1cc4906abd20457ebd3e87bd59215b589: Status 404 returned error can't find the container with id f19bfe2e98bdcc30b4f996d99b2dfbb1cc4906abd20457ebd3e87bd59215b589 Dec 03 08:10:02 crc kubenswrapper[4831]: I1203 08:10:02.535558 4831 generic.go:334] "Generic (PLEG): 
container finished" podID="7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" containerID="9c743015b310deec86cc3b40b4ece090a4fba60b8366f77ecc886cbd5e3e3e3e" exitCode=0 Dec 03 08:10:02 crc kubenswrapper[4831]: I1203 08:10:02.535676 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-mk5pz" event={"ID":"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c","Type":"ContainerDied","Data":"9c743015b310deec86cc3b40b4ece090a4fba60b8366f77ecc886cbd5e3e3e3e"} Dec 03 08:10:02 crc kubenswrapper[4831]: I1203 08:10:02.535936 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-mk5pz" event={"ID":"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c","Type":"ContainerStarted","Data":"f19bfe2e98bdcc30b4f996d99b2dfbb1cc4906abd20457ebd3e87bd59215b589"} Dec 03 08:10:03 crc kubenswrapper[4831]: I1203 08:10:03.658592 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-dcd7t" Dec 03 08:10:11 crc kubenswrapper[4831]: I1203 08:10:11.012852 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:10:11 crc kubenswrapper[4831]: E1203 08:10:11.013843 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:10:13 crc kubenswrapper[4831]: E1203 08:10:13.097562 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified" Dec 03 08:10:13 crc kubenswrapper[4831]: E1203 08:10:13.098154 4831 kuberuntime_manager.go:1274] "Unhandled Error" 
err="container &Container{Name:octavia-rsyslog,Image:quay.io/podified-antelope-centos9/openstack-rsyslog:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n554h577h575h5fdh684h65ch5dbh7ch64chc7h5f7h56bh667h68fhc5h95h5b7h55bh7fh646h56ch688h65ch574h5bfh57fh5b9h5d5h5c7h594h677hb4q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:MGMT_CIDR,Value:172.24.0.0/16,ValueFrom:nil,},EnvVar{Name:MGMT_GATEWAY,Value:172.23.0.150,ValueFrom:nil,},EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-merged,ReadOnly:false,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-merged,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:octavia-rsyslog-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pgrep -r DRST rsyslog],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:15,PeriodSeconds:13,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/pgrep -r DRST 
rsyslog],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:15,PeriodSeconds:15,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-rsyslog-7vxgf_openstack(d81b16a8-4c45-4d34-8c90-1e6cd00ead93): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 08:10:13 crc kubenswrapper[4831]: E1203 08:10:13.099758 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"octavia-rsyslog\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/octavia-rsyslog-7vxgf" podUID="d81b16a8-4c45-4d34-8c90-1e6cd00ead93" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.209703 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.307786 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run-ovn\") pod \"42c97ca1-7beb-4558-81d7-415fe4c11105\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.307886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2rk2f\" (UniqueName: \"kubernetes.io/projected/42c97ca1-7beb-4558-81d7-415fe4c11105-kube-api-access-2rk2f\") pod \"42c97ca1-7beb-4558-81d7-415fe4c11105\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.308088 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-additional-scripts\") pod \"42c97ca1-7beb-4558-81d7-415fe4c11105\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.308163 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-scripts\") pod \"42c97ca1-7beb-4558-81d7-415fe4c11105\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.308293 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run\") pod \"42c97ca1-7beb-4558-81d7-415fe4c11105\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.308373 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" 
(UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-log-ovn\") pod \"42c97ca1-7beb-4558-81d7-415fe4c11105\" (UID: \"42c97ca1-7beb-4558-81d7-415fe4c11105\") " Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.309188 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "42c97ca1-7beb-4558-81d7-415fe4c11105" (UID: "42c97ca1-7beb-4558-81d7-415fe4c11105"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.309262 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "42c97ca1-7beb-4558-81d7-415fe4c11105" (UID: "42c97ca1-7beb-4558-81d7-415fe4c11105"). InnerVolumeSpecName "var-run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.310551 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run" (OuterVolumeSpecName: "var-run") pod "42c97ca1-7beb-4558-81d7-415fe4c11105" (UID: "42c97ca1-7beb-4558-81d7-415fe4c11105"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.315138 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-scripts" (OuterVolumeSpecName: "scripts") pod "42c97ca1-7beb-4558-81d7-415fe4c11105" (UID: "42c97ca1-7beb-4558-81d7-415fe4c11105"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.316455 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "42c97ca1-7beb-4558-81d7-415fe4c11105" (UID: "42c97ca1-7beb-4558-81d7-415fe4c11105"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.346088 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42c97ca1-7beb-4558-81d7-415fe4c11105-kube-api-access-2rk2f" (OuterVolumeSpecName: "kube-api-access-2rk2f") pod "42c97ca1-7beb-4558-81d7-415fe4c11105" (UID: "42c97ca1-7beb-4558-81d7-415fe4c11105"). InnerVolumeSpecName "kube-api-access-2rk2f". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.410045 4831 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-additional-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.410085 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/42c97ca1-7beb-4558-81d7-415fe4c11105-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.410096 4831 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.410105 4831 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-log-ovn\") on node 
\"crc\" DevicePath \"\"" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.410115 4831 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/42c97ca1-7beb-4558-81d7-415fe4c11105-var-run-ovn\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.410124 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2rk2f\" (UniqueName: \"kubernetes.io/projected/42c97ca1-7beb-4558-81d7-415fe4c11105-kube-api-access-2rk2f\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.701541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-dcd7t-config-qkj8c" event={"ID":"42c97ca1-7beb-4558-81d7-415fe4c11105","Type":"ContainerDied","Data":"26238d0a71af10cecf43d2394ab143efbd30d6b11031b63e52d710a7bf1424cc"} Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.701586 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26238d0a71af10cecf43d2394ab143efbd30d6b11031b63e52d710a7bf1424cc" Dec 03 08:10:13 crc kubenswrapper[4831]: I1203 08:10:13.701596 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-dcd7t-config-qkj8c" Dec 03 08:10:13 crc kubenswrapper[4831]: E1203 08:10:13.702185 4831 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/gthiemonge/octavia-amphora-image:latest" Dec 03 08:10:13 crc kubenswrapper[4831]: E1203 08:10:13.702343 4831 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/gthiemonge/octavia-amphora-image,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DEST_DIR,Value:/usr/local/apache2/htdocs,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:amphora-image,ReadOnly:false,MountPath:/usr/local/apache2/htdocs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-image-upload-59f8cff499-8hjch_openstack(f5368cba-aa23-4696-92e8-f5c2306d8f57): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Dec 03 08:10:13 crc kubenswrapper[4831]: E1203 08:10:13.703465 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying 
config: context canceled\"" pod="openstack/octavia-image-upload-59f8cff499-8hjch" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" Dec 03 08:10:14 crc kubenswrapper[4831]: I1203 08:10:14.315728 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-dcd7t-config-qkj8c"] Dec 03 08:10:14 crc kubenswrapper[4831]: I1203 08:10:14.323995 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-dcd7t-config-qkj8c"] Dec 03 08:10:14 crc kubenswrapper[4831]: I1203 08:10:14.712899 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-mk5pz" event={"ID":"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c","Type":"ContainerStarted","Data":"e4dd7560ac1a7be74e9ad6ee92475b899ef996e4a5122626bf36aece0fa1fe41"} Dec 03 08:10:14 crc kubenswrapper[4831]: E1203 08:10:14.715304 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/gthiemonge/octavia-amphora-image\\\"\"" pod="openstack/octavia-image-upload-59f8cff499-8hjch" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" Dec 03 08:10:14 crc kubenswrapper[4831]: I1203 08:10:14.753158 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-db-sync-mk5pz" podStartSLOduration=14.753142004 podStartE2EDuration="14.753142004s" podCreationTimestamp="2025-12-03 08:10:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:10:14.749682107 +0000 UTC m=+5952.093265635" watchObservedRunningTime="2025-12-03 08:10:14.753142004 +0000 UTC m=+5952.096725512" Dec 03 08:10:15 crc kubenswrapper[4831]: I1203 08:10:15.043074 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42c97ca1-7beb-4558-81d7-415fe4c11105" path="/var/lib/kubelet/pods/42c97ca1-7beb-4558-81d7-415fe4c11105/volumes" Dec 03 08:10:15 crc kubenswrapper[4831]: 
I1203 08:10:15.268461 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:10:15 crc kubenswrapper[4831]: I1203 08:10:15.553299 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-api-6655c79d96-8cnxb" Dec 03 08:10:17 crc kubenswrapper[4831]: I1203 08:10:17.741832 4831 generic.go:334] "Generic (PLEG): container finished" podID="7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" containerID="e4dd7560ac1a7be74e9ad6ee92475b899ef996e4a5122626bf36aece0fa1fe41" exitCode=0 Dec 03 08:10:17 crc kubenswrapper[4831]: I1203 08:10:17.742187 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-mk5pz" event={"ID":"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c","Type":"ContainerDied","Data":"e4dd7560ac1a7be74e9ad6ee92475b899ef996e4a5122626bf36aece0fa1fe41"} Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.170267 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.230069 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data-merged\") pod \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.230246 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data\") pod \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.230301 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-scripts\") pod \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.230447 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-combined-ca-bundle\") pod \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\" (UID: \"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c\") " Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.235890 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-scripts" (OuterVolumeSpecName: "scripts") pod "7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" (UID: "7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.236404 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data" (OuterVolumeSpecName: "config-data") pod "7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" (UID: "7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.255777 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" (UID: "7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.256780 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" (UID: "7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.336086 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.336130 4831 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data-merged\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.336143 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.336153 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.766288 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-db-sync-mk5pz" event={"ID":"7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c","Type":"ContainerDied","Data":"f19bfe2e98bdcc30b4f996d99b2dfbb1cc4906abd20457ebd3e87bd59215b589"} Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.766341 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f19bfe2e98bdcc30b4f996d99b2dfbb1cc4906abd20457ebd3e87bd59215b589" Dec 03 08:10:19 crc kubenswrapper[4831]: I1203 08:10:19.766406 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-db-sync-mk5pz" Dec 03 08:10:20 crc kubenswrapper[4831]: I1203 08:10:20.401958 4831 scope.go:117] "RemoveContainer" containerID="4118cdbf98f0a5b447c2361fdcae316c630d4c8a1e091b507663859a4ee3db73" Dec 03 08:10:20 crc kubenswrapper[4831]: I1203 08:10:20.461427 4831 scope.go:117] "RemoveContainer" containerID="c04360e9abf6313203910473b490fd4400547523df0b433bc6b72f055e338698" Dec 03 08:10:26 crc kubenswrapper[4831]: I1203 08:10:26.013840 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:10:26 crc kubenswrapper[4831]: E1203 08:10:26.015248 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:10:27 crc kubenswrapper[4831]: I1203 08:10:27.862498 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8hjch" event={"ID":"f5368cba-aa23-4696-92e8-f5c2306d8f57","Type":"ContainerStarted","Data":"86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399"} Dec 03 08:10:29 crc kubenswrapper[4831]: I1203 08:10:29.889446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-rsyslog-7vxgf" event={"ID":"d81b16a8-4c45-4d34-8c90-1e6cd00ead93","Type":"ContainerStarted","Data":"455616724e8217e224528ab35700e8fbbb2a351bc537107053c3420c7df19836"} Dec 03 08:10:29 crc kubenswrapper[4831]: I1203 08:10:29.890358 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:10:29 crc kubenswrapper[4831]: I1203 08:10:29.912518 4831 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/octavia-rsyslog-7vxgf" podStartSLOduration=2.097233265 podStartE2EDuration="34.912499111s" podCreationTimestamp="2025-12-03 08:09:55 +0000 UTC" firstStartedPulling="2025-12-03 08:09:56.075852323 +0000 UTC m=+5933.419435821" lastFinishedPulling="2025-12-03 08:10:28.891118149 +0000 UTC m=+5966.234701667" observedRunningTime="2025-12-03 08:10:29.911772829 +0000 UTC m=+5967.255356337" watchObservedRunningTime="2025-12-03 08:10:29.912499111 +0000 UTC m=+5967.256082629" Dec 03 08:10:31 crc kubenswrapper[4831]: I1203 08:10:31.931628 4831 generic.go:334] "Generic (PLEG): container finished" podID="f5368cba-aa23-4696-92e8-f5c2306d8f57" containerID="86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399" exitCode=0 Dec 03 08:10:31 crc kubenswrapper[4831]: I1203 08:10:31.931844 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8hjch" event={"ID":"f5368cba-aa23-4696-92e8-f5c2306d8f57","Type":"ContainerDied","Data":"86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399"} Dec 03 08:10:33 crc kubenswrapper[4831]: I1203 08:10:33.957643 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8hjch" event={"ID":"f5368cba-aa23-4696-92e8-f5c2306d8f57","Type":"ContainerStarted","Data":"89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226"} Dec 03 08:10:40 crc kubenswrapper[4831]: I1203 08:10:40.604257 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-rsyslog-7vxgf" Dec 03 08:10:40 crc kubenswrapper[4831]: I1203 08:10:40.627883 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-image-upload-59f8cff499-8hjch" podStartSLOduration=9.143589979 podStartE2EDuration="45.627859824s" podCreationTimestamp="2025-12-03 08:09:55 +0000 UTC" firstStartedPulling="2025-12-03 08:09:56.709463477 +0000 UTC m=+5934.053046985" 
lastFinishedPulling="2025-12-03 08:10:33.193733312 +0000 UTC m=+5970.537316830" observedRunningTime="2025-12-03 08:10:33.985755711 +0000 UTC m=+5971.329339269" watchObservedRunningTime="2025-12-03 08:10:40.627859824 +0000 UTC m=+5977.971443342" Dec 03 08:10:41 crc kubenswrapper[4831]: I1203 08:10:41.021509 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:10:41 crc kubenswrapper[4831]: E1203 08:10:41.021777 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.013030 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:10:54 crc kubenswrapper[4831]: E1203 08:10:54.014272 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.063616 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8hjch"] Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.063885 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/octavia-image-upload-59f8cff499-8hjch" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" 
containerName="octavia-amphora-httpd" containerID="cri-o://89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226" gracePeriod=30 Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.666926 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.829859 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5368cba-aa23-4696-92e8-f5c2306d8f57-httpd-config\") pod \"f5368cba-aa23-4696-92e8-f5c2306d8f57\" (UID: \"f5368cba-aa23-4696-92e8-f5c2306d8f57\") " Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.829956 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/f5368cba-aa23-4696-92e8-f5c2306d8f57-amphora-image\") pod \"f5368cba-aa23-4696-92e8-f5c2306d8f57\" (UID: \"f5368cba-aa23-4696-92e8-f5c2306d8f57\") " Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.879745 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5368cba-aa23-4696-92e8-f5c2306d8f57-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "f5368cba-aa23-4696-92e8-f5c2306d8f57" (UID: "f5368cba-aa23-4696-92e8-f5c2306d8f57"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.882750 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5368cba-aa23-4696-92e8-f5c2306d8f57-amphora-image" (OuterVolumeSpecName: "amphora-image") pod "f5368cba-aa23-4696-92e8-f5c2306d8f57" (UID: "f5368cba-aa23-4696-92e8-f5c2306d8f57"). InnerVolumeSpecName "amphora-image". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.932290 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f5368cba-aa23-4696-92e8-f5c2306d8f57-httpd-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:54 crc kubenswrapper[4831]: I1203 08:10:54.932349 4831 reconciler_common.go:293] "Volume detached for volume \"amphora-image\" (UniqueName: \"kubernetes.io/empty-dir/f5368cba-aa23-4696-92e8-f5c2306d8f57-amphora-image\") on node \"crc\" DevicePath \"\"" Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.192240 4831 generic.go:334] "Generic (PLEG): container finished" podID="f5368cba-aa23-4696-92e8-f5c2306d8f57" containerID="89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226" exitCode=0 Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.192371 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8hjch" event={"ID":"f5368cba-aa23-4696-92e8-f5c2306d8f57","Type":"ContainerDied","Data":"89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226"} Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.192426 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-image-upload-59f8cff499-8hjch" event={"ID":"f5368cba-aa23-4696-92e8-f5c2306d8f57","Type":"ContainerDied","Data":"edf185dea521ca18382b402d0ae6dced43cbc3690b62ce48b00e1493957da8e7"} Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.192443 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-image-upload-59f8cff499-8hjch" Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.192458 4831 scope.go:117] "RemoveContainer" containerID="89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226" Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.223967 4831 scope.go:117] "RemoveContainer" containerID="86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399" Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.227081 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8hjch"] Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.240937 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-image-upload-59f8cff499-8hjch"] Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.248962 4831 scope.go:117] "RemoveContainer" containerID="89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226" Dec 03 08:10:55 crc kubenswrapper[4831]: E1203 08:10:55.249517 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226\": container with ID starting with 89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226 not found: ID does not exist" containerID="89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226" Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.249553 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226"} err="failed to get container status \"89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226\": rpc error: code = NotFound desc = could not find container \"89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226\": container with ID starting with 89e6452bf9ca30fc50daa2d46dc543e742431bfc0287ed2a85aae4e916107226 not 
found: ID does not exist" Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.249575 4831 scope.go:117] "RemoveContainer" containerID="86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399" Dec 03 08:10:55 crc kubenswrapper[4831]: E1203 08:10:55.249965 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399\": container with ID starting with 86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399 not found: ID does not exist" containerID="86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399" Dec 03 08:10:55 crc kubenswrapper[4831]: I1203 08:10:55.249991 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399"} err="failed to get container status \"86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399\": rpc error: code = NotFound desc = could not find container \"86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399\": container with ID starting with 86be6c69db64015e9961e09e26f73a4012299a4035f6c21a00af8fa37cd69399 not found: ID does not exist" Dec 03 08:10:57 crc kubenswrapper[4831]: I1203 08:10:57.031223 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" path="/var/lib/kubelet/pods/f5368cba-aa23-4696-92e8-f5c2306d8f57/volumes" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.113610 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-healthmanager-hw4hf"] Dec 03 08:11:00 crc kubenswrapper[4831]: E1203 08:11:00.114766 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" containerName="init" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.114795 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" containerName="init" Dec 03 08:11:00 crc kubenswrapper[4831]: E1203 08:11:00.114819 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" containerName="octavia-amphora-httpd" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.114834 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" containerName="octavia-amphora-httpd" Dec 03 08:11:00 crc kubenswrapper[4831]: E1203 08:11:00.114870 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42c97ca1-7beb-4558-81d7-415fe4c11105" containerName="ovn-config" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.114883 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="42c97ca1-7beb-4558-81d7-415fe4c11105" containerName="ovn-config" Dec 03 08:11:00 crc kubenswrapper[4831]: E1203 08:11:00.114909 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" containerName="init" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.114921 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" containerName="init" Dec 03 08:11:00 crc kubenswrapper[4831]: E1203 08:11:00.114943 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" containerName="octavia-db-sync" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.114956 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" containerName="octavia-db-sync" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.115385 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5368cba-aa23-4696-92e8-f5c2306d8f57" containerName="octavia-amphora-httpd" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.115424 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" containerName="octavia-db-sync" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.115447 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="42c97ca1-7beb-4558-81d7-415fe4c11105" containerName="ovn-config" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.117514 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.120340 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-scripts" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.120564 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-healthmanager-config-data" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.120761 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-certs-secret" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.125573 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-hw4hf"] Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.260762 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-amphora-certs\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.260824 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-hm-ports\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.260846 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-config-data-merged\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.260867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-combined-ca-bundle\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.261218 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-scripts\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.261283 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-config-data\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.363162 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-amphora-certs\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.363276 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-hm-ports\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.363312 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-config-data-merged\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.363465 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-combined-ca-bundle\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.363634 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-scripts\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.363689 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-config-data\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.364575 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" 
(UniqueName: \"kubernetes.io/configmap/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-hm-ports\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.365520 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-config-data-merged\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.372215 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-combined-ca-bundle\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.373182 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-config-data\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.376270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-amphora-certs\") pod \"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.378822 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4394f7db-9b3d-425c-a57b-2c7bdcbbe251-scripts\") pod 
\"octavia-healthmanager-hw4hf\" (UID: \"4394f7db-9b3d-425c-a57b-2c7bdcbbe251\") " pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:00 crc kubenswrapper[4831]: I1203 08:11:00.492874 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:01 crc kubenswrapper[4831]: I1203 08:11:01.083966 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-healthmanager-hw4hf"] Dec 03 08:11:01 crc kubenswrapper[4831]: I1203 08:11:01.267045 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hw4hf" event={"ID":"4394f7db-9b3d-425c-a57b-2c7bdcbbe251","Type":"ContainerStarted","Data":"563852d072d79a2930099454f125b8656ac391765c6ecfb7b7f7a586f128ee11"} Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.016298 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-housekeeping-2xwt9"] Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.018633 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.025654 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-scripts" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.027595 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-2xwt9"] Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.031476 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-housekeeping-config-data" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.206968 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-config-data-merged\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.208840 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-scripts\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.208877 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-amphora-certs\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.209087 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-combined-ca-bundle\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.209159 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-config-data\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.209200 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-hm-ports\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.280173 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hw4hf" event={"ID":"4394f7db-9b3d-425c-a57b-2c7bdcbbe251","Type":"ContainerStarted","Data":"8d76f264f5aaea6ee42b54f31c73ec7991fd9a16df9771aa29bd07a98e283875"} Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.310619 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-config-data-merged\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.310899 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-scripts\") pod \"octavia-housekeeping-2xwt9\" (UID: 
\"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.311479 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-config-data-merged\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.311708 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-amphora-certs\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.312006 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-combined-ca-bundle\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.312140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-config-data\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.312234 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-hm-ports\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 
08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.314103 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-hm-ports\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.317058 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-amphora-certs\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.321401 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-config-data\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.321775 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-combined-ca-bundle\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.329283 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d8c313f5-00e9-49ee-ab5e-3eefaaf09202-scripts\") pod \"octavia-housekeeping-2xwt9\" (UID: \"d8c313f5-00e9-49ee-ab5e-3eefaaf09202\") " pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.336189 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:02 crc kubenswrapper[4831]: I1203 08:11:02.901108 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-housekeeping-2xwt9"] Dec 03 08:11:02 crc kubenswrapper[4831]: W1203 08:11:02.901373 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd8c313f5_00e9_49ee_ab5e_3eefaaf09202.slice/crio-0640252ff10ade9686ecc01cb753546093d41592bf227dd25d6121bfe66cbac5 WatchSource:0}: Error finding container 0640252ff10ade9686ecc01cb753546093d41592bf227dd25d6121bfe66cbac5: Status 404 returned error can't find the container with id 0640252ff10ade9686ecc01cb753546093d41592bf227dd25d6121bfe66cbac5 Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.289673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-2xwt9" event={"ID":"d8c313f5-00e9-49ee-ab5e-3eefaaf09202","Type":"ContainerStarted","Data":"0640252ff10ade9686ecc01cb753546093d41592bf227dd25d6121bfe66cbac5"} Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.581835 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/octavia-worker-qj6pc"] Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.583685 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.587914 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-scripts" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.589261 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"octavia-worker-config-data" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.601547 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-qj6pc"] Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.739620 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-amphora-certs\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.739814 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c2473ca8-2ca7-4c12-afca-955d003ffa8b-hm-ports\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.740175 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-combined-ca-bundle\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.740395 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: 
\"kubernetes.io/empty-dir/c2473ca8-2ca7-4c12-afca-955d003ffa8b-config-data-merged\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.740518 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-scripts\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.740588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-config-data\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.841683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c2473ca8-2ca7-4c12-afca-955d003ffa8b-hm-ports\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.841773 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-combined-ca-bundle\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.841816 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c2473ca8-2ca7-4c12-afca-955d003ffa8b-config-data-merged\") pod \"octavia-worker-qj6pc\" (UID: 
\"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.841850 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-scripts\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.841952 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-config-data\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.841995 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-amphora-certs\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.842982 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/c2473ca8-2ca7-4c12-afca-955d003ffa8b-config-data-merged\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.843182 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hm-ports\" (UniqueName: \"kubernetes.io/configmap/c2473ca8-2ca7-4c12-afca-955d003ffa8b-hm-ports\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.848511 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"amphora-certs\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-amphora-certs\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.848740 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-scripts\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.849234 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-combined-ca-bundle\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.852335 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c2473ca8-2ca7-4c12-afca-955d003ffa8b-config-data\") pod \"octavia-worker-qj6pc\" (UID: \"c2473ca8-2ca7-4c12-afca-955d003ffa8b\") " pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:03 crc kubenswrapper[4831]: I1203 08:11:03.961672 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:04 crc kubenswrapper[4831]: I1203 08:11:04.300186 4831 generic.go:334] "Generic (PLEG): container finished" podID="4394f7db-9b3d-425c-a57b-2c7bdcbbe251" containerID="8d76f264f5aaea6ee42b54f31c73ec7991fd9a16df9771aa29bd07a98e283875" exitCode=0 Dec 03 08:11:04 crc kubenswrapper[4831]: I1203 08:11:04.300263 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hw4hf" event={"ID":"4394f7db-9b3d-425c-a57b-2c7bdcbbe251","Type":"ContainerDied","Data":"8d76f264f5aaea6ee42b54f31c73ec7991fd9a16df9771aa29bd07a98e283875"} Dec 03 08:11:04 crc kubenswrapper[4831]: I1203 08:11:04.513255 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/octavia-worker-qj6pc"] Dec 03 08:11:05 crc kubenswrapper[4831]: I1203 08:11:05.313153 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-qj6pc" event={"ID":"c2473ca8-2ca7-4c12-afca-955d003ffa8b","Type":"ContainerStarted","Data":"4d42fb41551fde921b13482a51893b793aa7ec07e83eb26a3df34c298954411d"} Dec 03 08:11:06 crc kubenswrapper[4831]: I1203 08:11:06.325022 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-2xwt9" event={"ID":"d8c313f5-00e9-49ee-ab5e-3eefaaf09202","Type":"ContainerStarted","Data":"51fa6fdde0159adf07763cff59af48fbdd43e13f78da2488eb4dd3814935170d"} Dec 03 08:11:06 crc kubenswrapper[4831]: I1203 08:11:06.328367 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-healthmanager-hw4hf" event={"ID":"4394f7db-9b3d-425c-a57b-2c7bdcbbe251","Type":"ContainerStarted","Data":"1e4b03804a38290c3bfb690ac4146f6368295df9cee997e05d0222229a3ec8a4"} Dec 03 08:11:06 crc kubenswrapper[4831]: I1203 08:11:06.328586 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:06 crc kubenswrapper[4831]: I1203 08:11:06.381010 4831 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openstack/octavia-healthmanager-hw4hf" podStartSLOduration=6.380989346 podStartE2EDuration="6.380989346s" podCreationTimestamp="2025-12-03 08:11:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:11:06.373904335 +0000 UTC m=+6003.717487843" watchObservedRunningTime="2025-12-03 08:11:06.380989346 +0000 UTC m=+6003.724572854" Dec 03 08:11:07 crc kubenswrapper[4831]: I1203 08:11:07.351448 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-2xwt9" event={"ID":"d8c313f5-00e9-49ee-ab5e-3eefaaf09202","Type":"ContainerDied","Data":"51fa6fdde0159adf07763cff59af48fbdd43e13f78da2488eb4dd3814935170d"} Dec 03 08:11:07 crc kubenswrapper[4831]: I1203 08:11:07.353261 4831 generic.go:334] "Generic (PLEG): container finished" podID="d8c313f5-00e9-49ee-ab5e-3eefaaf09202" containerID="51fa6fdde0159adf07763cff59af48fbdd43e13f78da2488eb4dd3814935170d" exitCode=0 Dec 03 08:11:08 crc kubenswrapper[4831]: I1203 08:11:08.013937 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:11:08 crc kubenswrapper[4831]: I1203 08:11:08.366894 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-housekeeping-2xwt9" event={"ID":"d8c313f5-00e9-49ee-ab5e-3eefaaf09202","Type":"ContainerStarted","Data":"43f7e69ff47cdd3ae3d8618308d22d7efc08e7edfa674c2b4c9729668009432a"} Dec 03 08:11:08 crc kubenswrapper[4831]: I1203 08:11:08.367770 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:08 crc kubenswrapper[4831]: I1203 08:11:08.370780 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-qj6pc" 
event={"ID":"c2473ca8-2ca7-4c12-afca-955d003ffa8b","Type":"ContainerStarted","Data":"6b652b2b43ba4cbce4a0152a4dc86ba809448f2e126eaa265fcbfe3837924c61"} Dec 03 08:11:08 crc kubenswrapper[4831]: I1203 08:11:08.375373 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"98f056b5c45cb7c3ebbde6e6e8b7c794ed7234028aab60b3d5caa65ca4fbade0"} Dec 03 08:11:08 crc kubenswrapper[4831]: I1203 08:11:08.401299 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-housekeeping-2xwt9" podStartSLOduration=5.458898412 podStartE2EDuration="7.401276901s" podCreationTimestamp="2025-12-03 08:11:01 +0000 UTC" firstStartedPulling="2025-12-03 08:11:02.911744139 +0000 UTC m=+6000.255327647" lastFinishedPulling="2025-12-03 08:11:04.854122628 +0000 UTC m=+6002.197706136" observedRunningTime="2025-12-03 08:11:08.386396328 +0000 UTC m=+6005.729979836" watchObservedRunningTime="2025-12-03 08:11:08.401276901 +0000 UTC m=+6005.744860419" Dec 03 08:11:09 crc kubenswrapper[4831]: I1203 08:11:09.388096 4831 generic.go:334] "Generic (PLEG): container finished" podID="c2473ca8-2ca7-4c12-afca-955d003ffa8b" containerID="6b652b2b43ba4cbce4a0152a4dc86ba809448f2e126eaa265fcbfe3837924c61" exitCode=0 Dec 03 08:11:09 crc kubenswrapper[4831]: I1203 08:11:09.388154 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-qj6pc" event={"ID":"c2473ca8-2ca7-4c12-afca-955d003ffa8b","Type":"ContainerDied","Data":"6b652b2b43ba4cbce4a0152a4dc86ba809448f2e126eaa265fcbfe3837924c61"} Dec 03 08:11:10 crc kubenswrapper[4831]: I1203 08:11:10.399902 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/octavia-worker-qj6pc" event={"ID":"c2473ca8-2ca7-4c12-afca-955d003ffa8b","Type":"ContainerStarted","Data":"a6eea9b0734a19a0eaf5b17befd39ab1740642189c703fb6d1119febc94b2734"} Dec 03 08:11:10 
crc kubenswrapper[4831]: I1203 08:11:10.400472 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:10 crc kubenswrapper[4831]: I1203 08:11:10.431198 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/octavia-worker-qj6pc" podStartSLOduration=5.096028733 podStartE2EDuration="7.431173886s" podCreationTimestamp="2025-12-03 08:11:03 +0000 UTC" firstStartedPulling="2025-12-03 08:11:04.721414095 +0000 UTC m=+6002.064997603" lastFinishedPulling="2025-12-03 08:11:07.056559208 +0000 UTC m=+6004.400142756" observedRunningTime="2025-12-03 08:11:10.420238486 +0000 UTC m=+6007.763822004" watchObservedRunningTime="2025-12-03 08:11:10.431173886 +0000 UTC m=+6007.774757394" Dec 03 08:11:15 crc kubenswrapper[4831]: I1203 08:11:15.553415 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-healthmanager-hw4hf" Dec 03 08:11:17 crc kubenswrapper[4831]: I1203 08:11:17.378059 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-housekeeping-2xwt9" Dec 03 08:11:19 crc kubenswrapper[4831]: I1203 08:11:19.039179 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/octavia-worker-qj6pc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.781913 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-t66fc"] Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.784528 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.809699 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t66fc"] Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.861376 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-catalog-content\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.861473 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-utilities\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.862066 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhhgn\" (UniqueName: \"kubernetes.io/projected/3abe4130-4c2d-4339-ac99-b3a7635e9a41-kube-api-access-nhhgn\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.963027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhhgn\" (UniqueName: \"kubernetes.io/projected/3abe4130-4c2d-4339-ac99-b3a7635e9a41-kube-api-access-nhhgn\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.963114 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-catalog-content\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.963154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-utilities\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.964048 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-utilities\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.964156 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-catalog-content\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:21 crc kubenswrapper[4831]: I1203 08:11:21.987449 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhhgn\" (UniqueName: \"kubernetes.io/projected/3abe4130-4c2d-4339-ac99-b3a7635e9a41-kube-api-access-nhhgn\") pod \"redhat-operators-t66fc\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:22 crc kubenswrapper[4831]: I1203 08:11:22.130441 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:22 crc kubenswrapper[4831]: I1203 08:11:22.636825 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-t66fc"] Dec 03 08:11:23 crc kubenswrapper[4831]: I1203 08:11:23.550864 4831 generic.go:334] "Generic (PLEG): container finished" podID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerID="ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d" exitCode=0 Dec 03 08:11:23 crc kubenswrapper[4831]: I1203 08:11:23.551234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t66fc" event={"ID":"3abe4130-4c2d-4339-ac99-b3a7635e9a41","Type":"ContainerDied","Data":"ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d"} Dec 03 08:11:23 crc kubenswrapper[4831]: I1203 08:11:23.551333 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t66fc" event={"ID":"3abe4130-4c2d-4339-ac99-b3a7635e9a41","Type":"ContainerStarted","Data":"20177518b96f057bc80529d9005eabb6c5373ee745d99104aea1a8d0c1397109"} Dec 03 08:11:25 crc kubenswrapper[4831]: I1203 08:11:25.592050 4831 generic.go:334] "Generic (PLEG): container finished" podID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerID="b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1" exitCode=0 Dec 03 08:11:25 crc kubenswrapper[4831]: I1203 08:11:25.592094 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t66fc" event={"ID":"3abe4130-4c2d-4339-ac99-b3a7635e9a41","Type":"ContainerDied","Data":"b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1"} Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.555904 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ggj9x"] Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.558575 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.569787 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggj9x"] Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.694783 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjmjd\" (UniqueName: \"kubernetes.io/projected/9fcf622a-6353-4271-a029-78f78658bced-kube-api-access-zjmjd\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.694847 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-catalog-content\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.694950 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-utilities\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.797091 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjmjd\" (UniqueName: \"kubernetes.io/projected/9fcf622a-6353-4271-a029-78f78658bced-kube-api-access-zjmjd\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.797430 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-catalog-content\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.797508 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-utilities\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.797915 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-catalog-content\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.797938 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-utilities\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:26 crc kubenswrapper[4831]: I1203 08:11:26.816199 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjmjd\" (UniqueName: \"kubernetes.io/projected/9fcf622a-6353-4271-a029-78f78658bced-kube-api-access-zjmjd\") pod \"community-operators-ggj9x\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:27 crc kubenswrapper[4831]: I1203 08:11:27.025797 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:27 crc kubenswrapper[4831]: I1203 08:11:27.657882 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ggj9x"] Dec 03 08:11:27 crc kubenswrapper[4831]: W1203 08:11:27.665656 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fcf622a_6353_4271_a029_78f78658bced.slice/crio-6ba54532fd10433a4cc6dcea14773e317c0a915e537c813b626a8e8b5c836225 WatchSource:0}: Error finding container 6ba54532fd10433a4cc6dcea14773e317c0a915e537c813b626a8e8b5c836225: Status 404 returned error can't find the container with id 6ba54532fd10433a4cc6dcea14773e317c0a915e537c813b626a8e8b5c836225 Dec 03 08:11:27 crc kubenswrapper[4831]: I1203 08:11:27.720794 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t66fc" event={"ID":"3abe4130-4c2d-4339-ac99-b3a7635e9a41","Type":"ContainerStarted","Data":"62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e"} Dec 03 08:11:27 crc kubenswrapper[4831]: I1203 08:11:27.723551 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggj9x" event={"ID":"9fcf622a-6353-4271-a029-78f78658bced","Type":"ContainerStarted","Data":"6ba54532fd10433a4cc6dcea14773e317c0a915e537c813b626a8e8b5c836225"} Dec 03 08:11:27 crc kubenswrapper[4831]: I1203 08:11:27.746662 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-t66fc" podStartSLOduration=3.948074132 podStartE2EDuration="6.74664379s" podCreationTimestamp="2025-12-03 08:11:21 +0000 UTC" firstStartedPulling="2025-12-03 08:11:23.558401779 +0000 UTC m=+6020.901985287" lastFinishedPulling="2025-12-03 08:11:26.356971437 +0000 UTC m=+6023.700554945" observedRunningTime="2025-12-03 08:11:27.739291502 +0000 UTC m=+6025.082875040" 
watchObservedRunningTime="2025-12-03 08:11:27.74664379 +0000 UTC m=+6025.090227298" Dec 03 08:11:28 crc kubenswrapper[4831]: I1203 08:11:28.737772 4831 generic.go:334] "Generic (PLEG): container finished" podID="9fcf622a-6353-4271-a029-78f78658bced" containerID="6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb" exitCode=0 Dec 03 08:11:28 crc kubenswrapper[4831]: I1203 08:11:28.737947 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggj9x" event={"ID":"9fcf622a-6353-4271-a029-78f78658bced","Type":"ContainerDied","Data":"6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb"} Dec 03 08:11:29 crc kubenswrapper[4831]: I1203 08:11:29.753586 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggj9x" event={"ID":"9fcf622a-6353-4271-a029-78f78658bced","Type":"ContainerStarted","Data":"40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3"} Dec 03 08:11:30 crc kubenswrapper[4831]: I1203 08:11:30.764955 4831 generic.go:334] "Generic (PLEG): container finished" podID="9fcf622a-6353-4271-a029-78f78658bced" containerID="40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3" exitCode=0 Dec 03 08:11:30 crc kubenswrapper[4831]: I1203 08:11:30.765065 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggj9x" event={"ID":"9fcf622a-6353-4271-a029-78f78658bced","Type":"ContainerDied","Data":"40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3"} Dec 03 08:11:32 crc kubenswrapper[4831]: I1203 08:11:32.131660 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:32 crc kubenswrapper[4831]: I1203 08:11:32.134165 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:32 crc kubenswrapper[4831]: I1203 08:11:32.791830 
4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggj9x" event={"ID":"9fcf622a-6353-4271-a029-78f78658bced","Type":"ContainerStarted","Data":"e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c"} Dec 03 08:11:32 crc kubenswrapper[4831]: I1203 08:11:32.812241 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ggj9x" podStartSLOduration=3.7060632 podStartE2EDuration="6.812224618s" podCreationTimestamp="2025-12-03 08:11:26 +0000 UTC" firstStartedPulling="2025-12-03 08:11:28.740584359 +0000 UTC m=+6026.084167877" lastFinishedPulling="2025-12-03 08:11:31.846745787 +0000 UTC m=+6029.190329295" observedRunningTime="2025-12-03 08:11:32.809502443 +0000 UTC m=+6030.153085951" watchObservedRunningTime="2025-12-03 08:11:32.812224618 +0000 UTC m=+6030.155808126" Dec 03 08:11:33 crc kubenswrapper[4831]: I1203 08:11:33.187218 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t66fc" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="registry-server" probeResult="failure" output=< Dec 03 08:11:33 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 08:11:33 crc kubenswrapper[4831]: > Dec 03 08:11:37 crc kubenswrapper[4831]: I1203 08:11:37.026522 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:37 crc kubenswrapper[4831]: I1203 08:11:37.027354 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:37 crc kubenswrapper[4831]: I1203 08:11:37.104050 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:37 crc kubenswrapper[4831]: I1203 08:11:37.915439 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:37 crc kubenswrapper[4831]: I1203 08:11:37.967131 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ggj9x"] Dec 03 08:11:39 crc kubenswrapper[4831]: I1203 08:11:39.868193 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ggj9x" podUID="9fcf622a-6353-4271-a029-78f78658bced" containerName="registry-server" containerID="cri-o://e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c" gracePeriod=2 Dec 03 08:11:40 crc kubenswrapper[4831]: I1203 08:11:40.873490 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:40 crc kubenswrapper[4831]: I1203 08:11:40.878930 4831 generic.go:334] "Generic (PLEG): container finished" podID="9fcf622a-6353-4271-a029-78f78658bced" containerID="e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c" exitCode=0 Dec 03 08:11:40 crc kubenswrapper[4831]: I1203 08:11:40.878966 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggj9x" event={"ID":"9fcf622a-6353-4271-a029-78f78658bced","Type":"ContainerDied","Data":"e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c"} Dec 03 08:11:40 crc kubenswrapper[4831]: I1203 08:11:40.878989 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ggj9x" event={"ID":"9fcf622a-6353-4271-a029-78f78658bced","Type":"ContainerDied","Data":"6ba54532fd10433a4cc6dcea14773e317c0a915e537c813b626a8e8b5c836225"} Dec 03 08:11:40 crc kubenswrapper[4831]: I1203 08:11:40.879009 4831 scope.go:117] "RemoveContainer" containerID="e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c" Dec 03 08:11:40 crc kubenswrapper[4831]: I1203 08:11:40.879118 4831 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ggj9x" Dec 03 08:11:40 crc kubenswrapper[4831]: E1203 08:11:40.929478 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.234:37854->38.102.83.234:39573: write tcp 38.102.83.234:37854->38.102.83.234:39573: write: broken pipe Dec 03 08:11:40 crc kubenswrapper[4831]: I1203 08:11:40.931645 4831 scope.go:117] "RemoveContainer" containerID="40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3" Dec 03 08:11:40 crc kubenswrapper[4831]: I1203 08:11:40.973021 4831 scope.go:117] "RemoveContainer" containerID="6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.017239 4831 scope.go:117] "RemoveContainer" containerID="e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c" Dec 03 08:11:41 crc kubenswrapper[4831]: E1203 08:11:41.017706 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c\": container with ID starting with e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c not found: ID does not exist" containerID="e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.017759 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c"} err="failed to get container status \"e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c\": rpc error: code = NotFound desc = could not find container \"e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c\": container with ID starting with e558ffe4e1b0e501ba0d94e7afcbd62e8123fabb14b424dafa67972fb62f651c not found: ID does not exist" Dec 03 08:11:41 crc 
kubenswrapper[4831]: I1203 08:11:41.017799 4831 scope.go:117] "RemoveContainer" containerID="40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3" Dec 03 08:11:41 crc kubenswrapper[4831]: E1203 08:11:41.018224 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3\": container with ID starting with 40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3 not found: ID does not exist" containerID="40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.018260 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3"} err="failed to get container status \"40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3\": rpc error: code = NotFound desc = could not find container \"40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3\": container with ID starting with 40e56c6b7027988f0dd26bccca1bbf661a036b6a9091d36df0fb24ddeeb4d6e3 not found: ID does not exist" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.018281 4831 scope.go:117] "RemoveContainer" containerID="6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb" Dec 03 08:11:41 crc kubenswrapper[4831]: E1203 08:11:41.018700 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb\": container with ID starting with 6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb not found: ID does not exist" containerID="6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.018722 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb"} err="failed to get container status \"6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb\": rpc error: code = NotFound desc = could not find container \"6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb\": container with ID starting with 6e51d5152c2308ea08778419359580facb796d43c890a61d1415f6a9ffa086fb not found: ID does not exist" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.043937 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjmjd\" (UniqueName: \"kubernetes.io/projected/9fcf622a-6353-4271-a029-78f78658bced-kube-api-access-zjmjd\") pod \"9fcf622a-6353-4271-a029-78f78658bced\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.044017 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-catalog-content\") pod \"9fcf622a-6353-4271-a029-78f78658bced\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.044094 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-utilities\") pod \"9fcf622a-6353-4271-a029-78f78658bced\" (UID: \"9fcf622a-6353-4271-a029-78f78658bced\") " Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.045144 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-utilities" (OuterVolumeSpecName: "utilities") pod "9fcf622a-6353-4271-a029-78f78658bced" (UID: "9fcf622a-6353-4271-a029-78f78658bced"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.072886 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fcf622a-6353-4271-a029-78f78658bced-kube-api-access-zjmjd" (OuterVolumeSpecName: "kube-api-access-zjmjd") pod "9fcf622a-6353-4271-a029-78f78658bced" (UID: "9fcf622a-6353-4271-a029-78f78658bced"). InnerVolumeSpecName "kube-api-access-zjmjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.127821 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9fcf622a-6353-4271-a029-78f78658bced" (UID: "9fcf622a-6353-4271-a029-78f78658bced"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.146357 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.146620 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9fcf622a-6353-4271-a029-78f78658bced-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.146696 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjmjd\" (UniqueName: \"kubernetes.io/projected/9fcf622a-6353-4271-a029-78f78658bced-kube-api-access-zjmjd\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 08:11:41.229117 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ggj9x"] Dec 03 08:11:41 crc kubenswrapper[4831]: I1203 
08:11:41.254787 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ggj9x"] Dec 03 08:11:43 crc kubenswrapper[4831]: I1203 08:11:43.025306 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fcf622a-6353-4271-a029-78f78658bced" path="/var/lib/kubelet/pods/9fcf622a-6353-4271-a029-78f78658bced/volumes" Dec 03 08:11:43 crc kubenswrapper[4831]: I1203 08:11:43.228449 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-t66fc" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="registry-server" probeResult="failure" output=< Dec 03 08:11:43 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 08:11:43 crc kubenswrapper[4831]: > Dec 03 08:11:51 crc kubenswrapper[4831]: I1203 08:11:51.058715 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-25bxf"] Dec 03 08:11:51 crc kubenswrapper[4831]: I1203 08:11:51.077286 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-218e-account-create-update-drsks"] Dec 03 08:11:51 crc kubenswrapper[4831]: I1203 08:11:51.094599 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-25bxf"] Dec 03 08:11:51 crc kubenswrapper[4831]: I1203 08:11:51.109727 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-218e-account-create-update-drsks"] Dec 03 08:11:52 crc kubenswrapper[4831]: I1203 08:11:52.214939 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:52 crc kubenswrapper[4831]: I1203 08:11:52.322467 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:52 crc kubenswrapper[4831]: I1203 08:11:52.994292 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-t66fc"] Dec 03 08:11:53 crc kubenswrapper[4831]: I1203 08:11:53.042526 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e47a0a9-fa16-47ce-aecb-2e300ac07ea2" path="/var/lib/kubelet/pods/0e47a0a9-fa16-47ce-aecb-2e300ac07ea2/volumes" Dec 03 08:11:53 crc kubenswrapper[4831]: I1203 08:11:53.043447 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27769491-0a4a-41cd-869a-83484c81873c" path="/var/lib/kubelet/pods/27769491-0a4a-41cd-869a-83484c81873c/volumes" Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.059298 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-t66fc" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="registry-server" containerID="cri-o://62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e" gracePeriod=2 Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.652775 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.750378 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhhgn\" (UniqueName: \"kubernetes.io/projected/3abe4130-4c2d-4339-ac99-b3a7635e9a41-kube-api-access-nhhgn\") pod \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.750541 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-utilities\") pod \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.750670 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-catalog-content\") pod \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\" (UID: \"3abe4130-4c2d-4339-ac99-b3a7635e9a41\") " Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.751213 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-utilities" (OuterVolumeSpecName: "utilities") pod "3abe4130-4c2d-4339-ac99-b3a7635e9a41" (UID: "3abe4130-4c2d-4339-ac99-b3a7635e9a41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.754958 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3abe4130-4c2d-4339-ac99-b3a7635e9a41-kube-api-access-nhhgn" (OuterVolumeSpecName: "kube-api-access-nhhgn") pod "3abe4130-4c2d-4339-ac99-b3a7635e9a41" (UID: "3abe4130-4c2d-4339-ac99-b3a7635e9a41"). InnerVolumeSpecName "kube-api-access-nhhgn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.851638 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3abe4130-4c2d-4339-ac99-b3a7635e9a41" (UID: "3abe4130-4c2d-4339-ac99-b3a7635e9a41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.852227 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhhgn\" (UniqueName: \"kubernetes.io/projected/3abe4130-4c2d-4339-ac99-b3a7635e9a41-kube-api-access-nhhgn\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.852247 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:54 crc kubenswrapper[4831]: I1203 08:11:54.852257 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3abe4130-4c2d-4339-ac99-b3a7635e9a41-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.069145 4831 generic.go:334] "Generic (PLEG): container finished" podID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerID="62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e" exitCode=0 Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.069201 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t66fc" event={"ID":"3abe4130-4c2d-4339-ac99-b3a7635e9a41","Type":"ContainerDied","Data":"62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e"} Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.069221 4831 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-operators-t66fc" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.069247 4831 scope.go:117] "RemoveContainer" containerID="62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.069234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-t66fc" event={"ID":"3abe4130-4c2d-4339-ac99-b3a7635e9a41","Type":"ContainerDied","Data":"20177518b96f057bc80529d9005eabb6c5373ee745d99104aea1a8d0c1397109"} Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.091623 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-t66fc"] Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.095475 4831 scope.go:117] "RemoveContainer" containerID="b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.102120 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-t66fc"] Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.117595 4831 scope.go:117] "RemoveContainer" containerID="ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.164588 4831 scope.go:117] "RemoveContainer" containerID="62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e" Dec 03 08:11:55 crc kubenswrapper[4831]: E1203 08:11:55.165082 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e\": container with ID starting with 62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e not found: ID does not exist" containerID="62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.165125 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e"} err="failed to get container status \"62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e\": rpc error: code = NotFound desc = could not find container \"62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e\": container with ID starting with 62553ba18632d33f30492c2e0df534214c5a2a053ae2c947e2a722121192e36e not found: ID does not exist" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.165152 4831 scope.go:117] "RemoveContainer" containerID="b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1" Dec 03 08:11:55 crc kubenswrapper[4831]: E1203 08:11:55.165478 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1\": container with ID starting with b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1 not found: ID does not exist" containerID="b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.165507 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1"} err="failed to get container status \"b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1\": rpc error: code = NotFound desc = could not find container \"b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1\": container with ID starting with b4b6dc7ec160c989837675d381d7e780d135a60743b81f8ba6e3c91bc243bdd1 not found: ID does not exist" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.165526 4831 scope.go:117] "RemoveContainer" containerID="ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d" Dec 03 08:11:55 crc kubenswrapper[4831]: E1203 
08:11:55.165837 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d\": container with ID starting with ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d not found: ID does not exist" containerID="ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d" Dec 03 08:11:55 crc kubenswrapper[4831]: I1203 08:11:55.165882 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d"} err="failed to get container status \"ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d\": rpc error: code = NotFound desc = could not find container \"ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d\": container with ID starting with ca37bd780933814253b89f414dbf9f2ecfd4351d8b795b61a96f3a59aec6661d not found: ID does not exist" Dec 03 08:11:57 crc kubenswrapper[4831]: I1203 08:11:57.029437 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" path="/var/lib/kubelet/pods/3abe4130-4c2d-4339-ac99-b3a7635e9a41/volumes" Dec 03 08:11:57 crc kubenswrapper[4831]: I1203 08:11:57.038087 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-n2s9j"] Dec 03 08:11:57 crc kubenswrapper[4831]: I1203 08:11:57.052483 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-n2s9j"] Dec 03 08:11:59 crc kubenswrapper[4831]: I1203 08:11:59.031147 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be0b20d0-fd57-4f34-a011-7b50b9fb7af9" path="/var/lib/kubelet/pods/be0b20d0-fd57-4f34-a011-7b50b9fb7af9/volumes" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.924760 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-8686484659-lc7t7"] Dec 03 08:12:06 
crc kubenswrapper[4831]: E1203 08:12:06.930724 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="extract-content" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.930747 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="extract-content" Dec 03 08:12:06 crc kubenswrapper[4831]: E1203 08:12:06.930761 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcf622a-6353-4271-a029-78f78658bced" containerName="extract-utilities" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.930768 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcf622a-6353-4271-a029-78f78658bced" containerName="extract-utilities" Dec 03 08:12:06 crc kubenswrapper[4831]: E1203 08:12:06.930786 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcf622a-6353-4271-a029-78f78658bced" containerName="registry-server" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.930791 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcf622a-6353-4271-a029-78f78658bced" containerName="registry-server" Dec 03 08:12:06 crc kubenswrapper[4831]: E1203 08:12:06.930805 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="extract-utilities" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.930811 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="extract-utilities" Dec 03 08:12:06 crc kubenswrapper[4831]: E1203 08:12:06.930824 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9fcf622a-6353-4271-a029-78f78658bced" containerName="extract-content" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.930830 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fcf622a-6353-4271-a029-78f78658bced" containerName="extract-content" Dec 03 08:12:06 crc 
kubenswrapper[4831]: E1203 08:12:06.930854 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="registry-server" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.930860 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="registry-server" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.931030 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9fcf622a-6353-4271-a029-78f78658bced" containerName="registry-server" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.931049 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="3abe4130-4c2d-4339-ac99-b3a7635e9a41" containerName="registry-server" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.932146 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.938496 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-4hq9n" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.938593 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.939024 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.941252 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Dec 03 08:12:06 crc kubenswrapper[4831]: I1203 08:12:06.945381 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-8686484659-lc7t7"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.001850 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 
08:12:07.002495 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerName="glance-log" containerID="cri-o://bf6e5a37fe4aade2d59fbb09879b583c1ef42ed33d5da75d3f75bc16643922b7" gracePeriod=30 Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.003253 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerName="glance-httpd" containerID="cri-o://6690fb5d84a0049fedb1a76e47388552d7104c6d764e7ba8724feb99cf5a3c36" gracePeriod=30 Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.039201 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-logs\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.039685 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-config-data\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.039892 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-horizon-secret-key\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.039957 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-scripts\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.039994 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4p8q\" (UniqueName: \"kubernetes.io/projected/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-kube-api-access-w4p8q\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.050373 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.050629 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-log" containerID="cri-o://98b142fd3b5b021abe0d2d78e7c248a25da7614f73931058ce4310073980f966" gracePeriod=30 Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.052361 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-httpd" containerID="cri-o://5c399a1e6ba809edc2fde19b39403671975d551377ce2d2380d31f78998d1f97" gracePeriod=30 Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.084780 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-684d94669-whgkj"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.086306 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.099629 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-684d94669-whgkj"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.141944 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-logs\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.142013 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-config-data\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.142117 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-horizon-secret-key\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.142169 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-scripts\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.142196 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4p8q\" (UniqueName: \"kubernetes.io/projected/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-kube-api-access-w4p8q\") pod 
\"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.144044 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-scripts\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.144495 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-config-data\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.146862 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-logs\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.147518 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-horizon-secret-key\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.159233 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4p8q\" (UniqueName: \"kubernetes.io/projected/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-kube-api-access-w4p8q\") pod \"horizon-8686484659-lc7t7\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc 
kubenswrapper[4831]: I1203 08:12:07.216581 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerID="bf6e5a37fe4aade2d59fbb09879b583c1ef42ed33d5da75d3f75bc16643922b7" exitCode=143 Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.216961 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cfd386b-0830-4781-ba21-109800ca3b3a","Type":"ContainerDied","Data":"bf6e5a37fe4aade2d59fbb09879b583c1ef42ed33d5da75d3f75bc16643922b7"} Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.218941 4831 generic.go:334] "Generic (PLEG): container finished" podID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerID="98b142fd3b5b021abe0d2d78e7c248a25da7614f73931058ce4310073980f966" exitCode=143 Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.219034 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567","Type":"ContainerDied","Data":"98b142fd3b5b021abe0d2d78e7c248a25da7614f73931058ce4310073980f966"} Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.244406 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-config-data\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.244921 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9428499d-9f6e-4c02-befb-31ec9f68f080-logs\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.245307 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xnrv\" (UniqueName: \"kubernetes.io/projected/9428499d-9f6e-4c02-befb-31ec9f68f080-kube-api-access-4xnrv\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.245600 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9428499d-9f6e-4c02-befb-31ec9f68f080-horizon-secret-key\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.247654 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-scripts\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.257340 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.350489 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-config-data\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.350658 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9428499d-9f6e-4c02-befb-31ec9f68f080-logs\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.350692 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xnrv\" (UniqueName: \"kubernetes.io/projected/9428499d-9f6e-4c02-befb-31ec9f68f080-kube-api-access-4xnrv\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.350733 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9428499d-9f6e-4c02-befb-31ec9f68f080-horizon-secret-key\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.350800 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-scripts\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 
08:12:07.351264 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9428499d-9f6e-4c02-befb-31ec9f68f080-logs\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.351509 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-scripts\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.352163 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-config-data\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.365295 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9428499d-9f6e-4c02-befb-31ec9f68f080-horizon-secret-key\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.372125 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xnrv\" (UniqueName: \"kubernetes.io/projected/9428499d-9f6e-4c02-befb-31ec9f68f080-kube-api-access-4xnrv\") pod \"horizon-684d94669-whgkj\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.572556 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.658587 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8686484659-lc7t7"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.680686 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-54959bccf-pc4kb"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.682263 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.696574 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54959bccf-pc4kb"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.760049 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8686484659-lc7t7"] Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.859657 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/acdd083e-b37d-4cdf-b856-3eabe411df76-horizon-secret-key\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.859789 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-config-data\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.859830 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4wc8\" (UniqueName: \"kubernetes.io/projected/acdd083e-b37d-4cdf-b856-3eabe411df76-kube-api-access-k4wc8\") pod \"horizon-54959bccf-pc4kb\" 
(UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.860063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acdd083e-b37d-4cdf-b856-3eabe411df76-logs\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.860182 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-scripts\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.962110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-config-data\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.962483 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4wc8\" (UniqueName: \"kubernetes.io/projected/acdd083e-b37d-4cdf-b856-3eabe411df76-kube-api-access-k4wc8\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.962570 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acdd083e-b37d-4cdf-b856-3eabe411df76-logs\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc 
kubenswrapper[4831]: I1203 08:12:07.962658 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-scripts\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.962692 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/acdd083e-b37d-4cdf-b856-3eabe411df76-horizon-secret-key\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.963497 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acdd083e-b37d-4cdf-b856-3eabe411df76-logs\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.963626 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-scripts\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.963931 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-config-data\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.968679 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/acdd083e-b37d-4cdf-b856-3eabe411df76-horizon-secret-key\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:07 crc kubenswrapper[4831]: I1203 08:12:07.976861 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4wc8\" (UniqueName: \"kubernetes.io/projected/acdd083e-b37d-4cdf-b856-3eabe411df76-kube-api-access-k4wc8\") pod \"horizon-54959bccf-pc4kb\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:08 crc kubenswrapper[4831]: I1203 08:12:08.001780 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:08 crc kubenswrapper[4831]: I1203 08:12:08.235901 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-684d94669-whgkj"] Dec 03 08:12:08 crc kubenswrapper[4831]: I1203 08:12:08.265476 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8686484659-lc7t7" event={"ID":"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f","Type":"ContainerStarted","Data":"845271e17fc364bb89434ede012c7a0db3031a2fb9d2eff0b802a2e988f388c0"} Dec 03 08:12:08 crc kubenswrapper[4831]: I1203 08:12:08.627181 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-54959bccf-pc4kb"] Dec 03 08:12:09 crc kubenswrapper[4831]: I1203 08:12:09.278555 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-684d94669-whgkj" event={"ID":"9428499d-9f6e-4c02-befb-31ec9f68f080","Type":"ContainerStarted","Data":"d57a8b255e5e43e1fb2603522d9e8d381d07fb636fe540e86d50c9bcdf1ad3e2"} Dec 03 08:12:09 crc kubenswrapper[4831]: I1203 08:12:09.280418 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54959bccf-pc4kb" 
event={"ID":"acdd083e-b37d-4cdf-b856-3eabe411df76","Type":"ContainerStarted","Data":"57938041de8357ccd9e7d6ab3253c6dd6bbb96a926ea4ccfe497d978bf3b5789"} Dec 03 08:12:10 crc kubenswrapper[4831]: I1203 08:12:10.253173 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.1.52:9292/healthcheck\": read tcp 10.217.0.2:49952->10.217.1.52:9292: read: connection reset by peer" Dec 03 08:12:10 crc kubenswrapper[4831]: I1203 08:12:10.253212 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.1.52:9292/healthcheck\": read tcp 10.217.0.2:49944->10.217.1.52:9292: read: connection reset by peer" Dec 03 08:12:10 crc kubenswrapper[4831]: I1203 08:12:10.292310 4831 generic.go:334] "Generic (PLEG): container finished" podID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerID="6690fb5d84a0049fedb1a76e47388552d7104c6d764e7ba8724feb99cf5a3c36" exitCode=0 Dec 03 08:12:10 crc kubenswrapper[4831]: I1203 08:12:10.292355 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cfd386b-0830-4781-ba21-109800ca3b3a","Type":"ContainerDied","Data":"6690fb5d84a0049fedb1a76e47388552d7104c6d764e7ba8724feb99cf5a3c36"} Dec 03 08:12:10 crc kubenswrapper[4831]: I1203 08:12:10.295633 4831 generic.go:334] "Generic (PLEG): container finished" podID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerID="5c399a1e6ba809edc2fde19b39403671975d551377ce2d2380d31f78998d1f97" exitCode=0 Dec 03 08:12:10 crc kubenswrapper[4831]: I1203 08:12:10.295670 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567","Type":"ContainerDied","Data":"5c399a1e6ba809edc2fde19b39403671975d551377ce2d2380d31f78998d1f97"} Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.353054 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.370138 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567","Type":"ContainerDied","Data":"52e1de03c0bf5638b37329693fbb73f5c59d41d3285b11f533718bf1cf38d3dc"} Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.370194 4831 scope.go:117] "RemoveContainer" containerID="5c399a1e6ba809edc2fde19b39403671975d551377ce2d2380d31f78998d1f97" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.370345 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.379414 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-684d94669-whgkj" event={"ID":"9428499d-9f6e-4c02-befb-31ec9f68f080","Type":"ContainerStarted","Data":"45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8"} Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.385468 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54959bccf-pc4kb" event={"ID":"acdd083e-b37d-4cdf-b856-3eabe411df76","Type":"ContainerStarted","Data":"cabda8be23ce9da31fff09b2b69121c21e28f01612c4d672affbf094037a9161"} Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.400408 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.404567 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8686484659-lc7t7" event={"ID":"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f","Type":"ContainerStarted","Data":"f7671b418fa36a50c5d5c171e0f1d8106072742f881b63c834b8c5b2bf3bcd32"} Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.429832 4831 scope.go:117] "RemoveContainer" containerID="98b142fd3b5b021abe0d2d78e7c248a25da7614f73931058ce4310073980f966" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.445708 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-config-data\") pod \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.445758 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-ceph\") pod \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.445809 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-scripts\") pod \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.445843 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-combined-ca-bundle\") pod \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.445878 
4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-httpd-run\") pod \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.445916 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72rw2\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-kube-api-access-72rw2\") pod \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.445980 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-logs\") pod \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\" (UID: \"5af73d5c-0ede-4bfc-b7c5-46a6d0f49567\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.446745 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" (UID: "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.447067 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-logs" (OuterVolumeSpecName: "logs") pod "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" (UID: "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.476749 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-kube-api-access-72rw2" (OuterVolumeSpecName: "kube-api-access-72rw2") pod "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" (UID: "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567"). InnerVolumeSpecName "kube-api-access-72rw2". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.484569 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-scripts" (OuterVolumeSpecName: "scripts") pod "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" (UID: "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.485128 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-ceph" (OuterVolumeSpecName: "ceph") pod "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" (UID: "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.548164 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-combined-ca-bundle\") pod \"2cfd386b-0830-4781-ba21-109800ca3b3a\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.548265 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-ceph\") pod \"2cfd386b-0830-4781-ba21-109800ca3b3a\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.548298 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-logs\") pod \"2cfd386b-0830-4781-ba21-109800ca3b3a\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.549332 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-config-data\") pod \"2cfd386b-0830-4781-ba21-109800ca3b3a\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.549430 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46k8j\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-kube-api-access-46k8j\") pod \"2cfd386b-0830-4781-ba21-109800ca3b3a\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.549598 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-httpd-run\") pod \"2cfd386b-0830-4781-ba21-109800ca3b3a\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.549613 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-logs" (OuterVolumeSpecName: "logs") pod "2cfd386b-0830-4781-ba21-109800ca3b3a" (UID: "2cfd386b-0830-4781-ba21-109800ca3b3a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.549641 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-scripts\") pod \"2cfd386b-0830-4781-ba21-109800ca3b3a\" (UID: \"2cfd386b-0830-4781-ba21-109800ca3b3a\") " Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.549897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2cfd386b-0830-4781-ba21-109800ca3b3a" (UID: "2cfd386b-0830-4781-ba21-109800ca3b3a"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.550553 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.550567 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.550579 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.550588 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.550598 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-72rw2\" (UniqueName: \"kubernetes.io/projected/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-kube-api-access-72rw2\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.550607 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.550614 4831 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2cfd386b-0830-4781-ba21-109800ca3b3a-httpd-run\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.570577 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-kube-api-access-46k8j" (OuterVolumeSpecName: "kube-api-access-46k8j") pod "2cfd386b-0830-4781-ba21-109800ca3b3a" (UID: "2cfd386b-0830-4781-ba21-109800ca3b3a"). InnerVolumeSpecName "kube-api-access-46k8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.571849 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-scripts" (OuterVolumeSpecName: "scripts") pod "2cfd386b-0830-4781-ba21-109800ca3b3a" (UID: "2cfd386b-0830-4781-ba21-109800ca3b3a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.584393 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-ceph" (OuterVolumeSpecName: "ceph") pod "2cfd386b-0830-4781-ba21-109800ca3b3a" (UID: "2cfd386b-0830-4781-ba21-109800ca3b3a"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.647924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" (UID: "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.653912 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.653940 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.653950 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.653958 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46k8j\" (UniqueName: \"kubernetes.io/projected/2cfd386b-0830-4781-ba21-109800ca3b3a-kube-api-access-46k8j\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.669633 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cfd386b-0830-4781-ba21-109800ca3b3a" (UID: "2cfd386b-0830-4781-ba21-109800ca3b3a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.675163 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-config-data" (OuterVolumeSpecName: "config-data") pod "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" (UID: "5af73d5c-0ede-4bfc-b7c5-46a6d0f49567"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.690908 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-config-data" (OuterVolumeSpecName: "config-data") pod "2cfd386b-0830-4781-ba21-109800ca3b3a" (UID: "2cfd386b-0830-4781-ba21-109800ca3b3a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.756135 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.756212 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cfd386b-0830-4781-ba21-109800ca3b3a-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:15 crc kubenswrapper[4831]: I1203 08:12:15.756232 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.013022 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.036152 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.042723 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:12:16 crc kubenswrapper[4831]: E1203 08:12:16.043115 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerName="glance-log" Dec 03 08:12:16 crc 
kubenswrapper[4831]: I1203 08:12:16.043134 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerName="glance-log" Dec 03 08:12:16 crc kubenswrapper[4831]: E1203 08:12:16.043143 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerName="glance-httpd" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.043150 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerName="glance-httpd" Dec 03 08:12:16 crc kubenswrapper[4831]: E1203 08:12:16.043175 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-log" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.043181 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-log" Dec 03 08:12:16 crc kubenswrapper[4831]: E1203 08:12:16.043202 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-httpd" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.043207 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-httpd" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.043385 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerName="glance-httpd" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.043404 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-log" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.043415 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" containerName="glance-log" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.043423 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" containerName="glance-httpd" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.046504 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.050212 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.064807 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.164767 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/34509753-3280-4c6f-91c5-c96cd04044a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.164821 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34509753-3280-4c6f-91c5-c96cd04044a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.164854 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.164877 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34509753-3280-4c6f-91c5-c96cd04044a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.164923 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.165091 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.165153 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqmm7\" (UniqueName: \"kubernetes.io/projected/34509753-3280-4c6f-91c5-c96cd04044a3-kube-api-access-cqmm7\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.267119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/34509753-3280-4c6f-91c5-c96cd04044a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.267876 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34509753-3280-4c6f-91c5-c96cd04044a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.267925 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.267953 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34509753-3280-4c6f-91c5-c96cd04044a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.268027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.268143 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.268178 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqmm7\" (UniqueName: 
\"kubernetes.io/projected/34509753-3280-4c6f-91c5-c96cd04044a3-kube-api-access-cqmm7\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.268884 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34509753-3280-4c6f-91c5-c96cd04044a3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.270671 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34509753-3280-4c6f-91c5-c96cd04044a3-logs\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.272242 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.272701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/34509753-3280-4c6f-91c5-c96cd04044a3-ceph\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.273461 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: 
\"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.285519 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34509753-3280-4c6f-91c5-c96cd04044a3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.286292 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqmm7\" (UniqueName: \"kubernetes.io/projected/34509753-3280-4c6f-91c5-c96cd04044a3-kube-api-access-cqmm7\") pod \"glance-default-internal-api-0\" (UID: \"34509753-3280-4c6f-91c5-c96cd04044a3\") " pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.365067 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.416308 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8686484659-lc7t7" event={"ID":"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f","Type":"ContainerStarted","Data":"4b30e49b85f19483632d3406a2b8c323cb1f4a08f16fc633981378ca2004b5e9"} Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.416502 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8686484659-lc7t7" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerName="horizon-log" containerID="cri-o://f7671b418fa36a50c5d5c171e0f1d8106072742f881b63c834b8c5b2bf3bcd32" gracePeriod=30 Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.417013 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-8686484659-lc7t7" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerName="horizon" 
containerID="cri-o://4b30e49b85f19483632d3406a2b8c323cb1f4a08f16fc633981378ca2004b5e9" gracePeriod=30 Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.440602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"2cfd386b-0830-4781-ba21-109800ca3b3a","Type":"ContainerDied","Data":"782ddc0118cfb6e3f3715dca6f541629bf48f9e38ecf204d54d06643b7c9498d"} Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.440661 4831 scope.go:117] "RemoveContainer" containerID="6690fb5d84a0049fedb1a76e47388552d7104c6d764e7ba8724feb99cf5a3c36" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.440782 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.449554 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-684d94669-whgkj" event={"ID":"9428499d-9f6e-4c02-befb-31ec9f68f080","Type":"ContainerStarted","Data":"47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18"} Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.453813 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-8686484659-lc7t7" podStartSLOduration=3.266352564 podStartE2EDuration="10.453799711s" podCreationTimestamp="2025-12-03 08:12:06 +0000 UTC" firstStartedPulling="2025-12-03 08:12:07.756546997 +0000 UTC m=+6065.100130505" lastFinishedPulling="2025-12-03 08:12:14.943994144 +0000 UTC m=+6072.287577652" observedRunningTime="2025-12-03 08:12:16.440742904 +0000 UTC m=+6073.784326402" watchObservedRunningTime="2025-12-03 08:12:16.453799711 +0000 UTC m=+6073.797383219" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.457848 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54959bccf-pc4kb" 
event={"ID":"acdd083e-b37d-4cdf-b856-3eabe411df76","Type":"ContainerStarted","Data":"b0050ee94f57530d896caebecbbc9c0375ab39b99d2ef48d7fcb100dc77c32a1"} Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.471804 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-684d94669-whgkj" podStartSLOduration=2.799715986 podStartE2EDuration="9.471789241s" podCreationTimestamp="2025-12-03 08:12:07 +0000 UTC" firstStartedPulling="2025-12-03 08:12:08.309592643 +0000 UTC m=+6065.653176141" lastFinishedPulling="2025-12-03 08:12:14.981665888 +0000 UTC m=+6072.325249396" observedRunningTime="2025-12-03 08:12:16.467000691 +0000 UTC m=+6073.810584219" watchObservedRunningTime="2025-12-03 08:12:16.471789241 +0000 UTC m=+6073.815372749" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.501795 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-54959bccf-pc4kb" podStartSLOduration=3.174138048 podStartE2EDuration="9.501782105s" podCreationTimestamp="2025-12-03 08:12:07 +0000 UTC" firstStartedPulling="2025-12-03 08:12:08.655442815 +0000 UTC m=+6065.999026323" lastFinishedPulling="2025-12-03 08:12:14.983086872 +0000 UTC m=+6072.326670380" observedRunningTime="2025-12-03 08:12:16.488685678 +0000 UTC m=+6073.832269206" watchObservedRunningTime="2025-12-03 08:12:16.501782105 +0000 UTC m=+6073.845365613" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.524744 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.529982 4831 scope.go:117] "RemoveContainer" containerID="bf6e5a37fe4aade2d59fbb09879b583c1ef42ed33d5da75d3f75bc16643922b7" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.538987 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.547369 4831 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.562722 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.568194 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.616447 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.685106 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf75c26-8fcd-4895-954b-c02971d19231-logs\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.685183 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.685223 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdf75c26-8fcd-4895-954b-c02971d19231-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.685270 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cdf75c26-8fcd-4895-954b-c02971d19231-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.685286 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq8bx\" (UniqueName: \"kubernetes.io/projected/cdf75c26-8fcd-4895-954b-c02971d19231-kube-api-access-lq8bx\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.685354 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.685407 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.786522 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf75c26-8fcd-4895-954b-c02971d19231-logs\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.786860 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.786918 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdf75c26-8fcd-4895-954b-c02971d19231-ceph\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.786954 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdf75c26-8fcd-4895-954b-c02971d19231-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.786981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq8bx\" (UniqueName: \"kubernetes.io/projected/cdf75c26-8fcd-4895-954b-c02971d19231-kube-api-access-lq8bx\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.787046 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.787116 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.787383 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cdf75c26-8fcd-4895-954b-c02971d19231-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.787754 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cdf75c26-8fcd-4895-954b-c02971d19231-logs\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.792779 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-scripts\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.793282 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-config-data\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.793956 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/cdf75c26-8fcd-4895-954b-c02971d19231-ceph\") pod \"glance-default-external-api-0\" (UID: 
\"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.796387 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cdf75c26-8fcd-4895-954b-c02971d19231-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.803421 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq8bx\" (UniqueName: \"kubernetes.io/projected/cdf75c26-8fcd-4895-954b-c02971d19231-kube-api-access-lq8bx\") pod \"glance-default-external-api-0\" (UID: \"cdf75c26-8fcd-4895-954b-c02971d19231\") " pod="openstack/glance-default-external-api-0" Dec 03 08:12:16 crc kubenswrapper[4831]: I1203 08:12:16.896525 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Dec 03 08:12:17 crc kubenswrapper[4831]: I1203 08:12:17.024904 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cfd386b-0830-4781-ba21-109800ca3b3a" path="/var/lib/kubelet/pods/2cfd386b-0830-4781-ba21-109800ca3b3a/volumes" Dec 03 08:12:17 crc kubenswrapper[4831]: I1203 08:12:17.025652 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5af73d5c-0ede-4bfc-b7c5-46a6d0f49567" path="/var/lib/kubelet/pods/5af73d5c-0ede-4bfc-b7c5-46a6d0f49567/volumes" Dec 03 08:12:17 crc kubenswrapper[4831]: I1203 08:12:17.076234 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Dec 03 08:12:17 crc kubenswrapper[4831]: I1203 08:12:17.260872 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:17 crc kubenswrapper[4831]: I1203 08:12:17.469751 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34509753-3280-4c6f-91c5-c96cd04044a3","Type":"ContainerStarted","Data":"1070c9ad01642a151379f048cf6cc78b34f45b5140e08a33cbc43bbfa5e6bcad"} Dec 03 08:12:17 crc kubenswrapper[4831]: I1203 08:12:17.544937 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Dec 03 08:12:17 crc kubenswrapper[4831]: I1203 08:12:17.574504 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:17 crc kubenswrapper[4831]: I1203 08:12:17.574582 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:18 crc kubenswrapper[4831]: I1203 08:12:18.004614 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:18 crc kubenswrapper[4831]: I1203 08:12:18.004999 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:18 crc kubenswrapper[4831]: I1203 08:12:18.498248 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34509753-3280-4c6f-91c5-c96cd04044a3","Type":"ContainerStarted","Data":"ed4359f100a0c82dc5a6f0593fe7d3b217dd3f72940fd38e83434d11e494bcbe"} Dec 03 08:12:18 crc kubenswrapper[4831]: I1203 08:12:18.501010 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdf75c26-8fcd-4895-954b-c02971d19231","Type":"ContainerStarted","Data":"7893905c6da443fefed75fb9c1f21c5bd2094ef9db22ec4a235f30734050afe6"} Dec 03 08:12:18 crc kubenswrapper[4831]: I1203 08:12:18.501069 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"cdf75c26-8fcd-4895-954b-c02971d19231","Type":"ContainerStarted","Data":"5cb36e886c63bc858d082534be269963094c66c8e60c8f85f72a9a4afd10ee2b"} Dec 03 08:12:19 crc kubenswrapper[4831]: I1203 08:12:19.512688 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"34509753-3280-4c6f-91c5-c96cd04044a3","Type":"ContainerStarted","Data":"554c8190ee54bbcc59b4754acce9fbf08c2db16dcff656cfb552c3cd422b2181"} Dec 03 08:12:19 crc kubenswrapper[4831]: I1203 08:12:19.519190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"cdf75c26-8fcd-4895-954b-c02971d19231","Type":"ContainerStarted","Data":"5f791b5af1e27b4e88aec8e58a02c6c35ee98174a5171a4768429cc8566e98f4"} Dec 03 08:12:19 crc kubenswrapper[4831]: I1203 08:12:19.586782 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=3.586763493 podStartE2EDuration="3.586763493s" podCreationTimestamp="2025-12-03 08:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:12:19.579571899 +0000 UTC m=+6076.923155417" watchObservedRunningTime="2025-12-03 08:12:19.586763493 +0000 UTC m=+6076.930347001" Dec 03 08:12:19 crc kubenswrapper[4831]: I1203 08:12:19.622224 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=3.622193996 podStartE2EDuration="3.622193996s" podCreationTimestamp="2025-12-03 08:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:12:19.608261012 +0000 UTC m=+6076.951844520" watchObservedRunningTime="2025-12-03 08:12:19.622193996 +0000 UTC m=+6076.965777524" Dec 03 08:12:20 crc kubenswrapper[4831]: I1203 08:12:20.602415 4831 
scope.go:117] "RemoveContainer" containerID="0de074cd3685fe1e0ac59b03b11acc0dc7d15626ae841da1acc1f73815b2f917" Dec 03 08:12:20 crc kubenswrapper[4831]: I1203 08:12:20.637141 4831 scope.go:117] "RemoveContainer" containerID="a5474d58cb26367afdfdcf3687160c61d1384a6a99e81a9df015b289d7275ce0" Dec 03 08:12:20 crc kubenswrapper[4831]: I1203 08:12:20.691930 4831 scope.go:117] "RemoveContainer" containerID="be3cce8c69525fd8502a9b86191a3cf670815ece566d0489c60f1396e30559a1" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.045660 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-eb20-account-create-update-pv52t"] Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.065950 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-h98qc"] Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.076217 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-h98qc"] Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.083809 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-eb20-account-create-update-pv52t"] Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.365817 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.366165 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.413365 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.416428 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.586826 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.586860 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.897475 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.897524 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.934247 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 08:12:26 crc kubenswrapper[4831]: I1203 08:12:26.969291 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Dec 03 08:12:27 crc kubenswrapper[4831]: I1203 08:12:27.028025 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d8aef98-04c2-4da9-b3cc-da5f44144eea" path="/var/lib/kubelet/pods/8d8aef98-04c2-4da9-b3cc-da5f44144eea/volumes" Dec 03 08:12:27 crc kubenswrapper[4831]: I1203 08:12:27.029126 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd22683-04e1-40b1-b6b5-31caa4789313" path="/var/lib/kubelet/pods/edd22683-04e1-40b1-b6b5-31caa4789313/volumes" Dec 03 08:12:27 crc kubenswrapper[4831]: I1203 08:12:27.576706 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-684d94669-whgkj" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.118:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8080: connect: connection refused" Dec 03 08:12:27 crc kubenswrapper[4831]: I1203 08:12:27.596390 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-external-api-0" Dec 03 08:12:27 crc kubenswrapper[4831]: I1203 08:12:27.596431 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Dec 03 08:12:28 crc kubenswrapper[4831]: I1203 08:12:28.005270 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-54959bccf-pc4kb" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused" Dec 03 08:12:28 crc kubenswrapper[4831]: I1203 08:12:28.841855 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:28 crc kubenswrapper[4831]: I1203 08:12:28.842926 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 08:12:28 crc kubenswrapper[4831]: I1203 08:12:28.848077 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Dec 03 08:12:29 crc kubenswrapper[4831]: I1203 08:12:29.610422 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 08:12:29 crc kubenswrapper[4831]: I1203 08:12:29.610734 4831 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 03 08:12:30 crc kubenswrapper[4831]: I1203 08:12:30.098206 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 08:12:30 crc kubenswrapper[4831]: I1203 08:12:30.128663 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Dec 03 08:12:37 crc kubenswrapper[4831]: I1203 08:12:37.060554 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-kg6v2"] Dec 03 08:12:37 crc kubenswrapper[4831]: I1203 08:12:37.073801 4831 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-kg6v2"] Dec 03 08:12:39 crc kubenswrapper[4831]: I1203 08:12:39.026695 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4836bb4-0fd6-403e-ad70-27483213715e" path="/var/lib/kubelet/pods/f4836bb4-0fd6-403e-ad70-27483213715e/volumes" Dec 03 08:12:39 crc kubenswrapper[4831]: I1203 08:12:39.367897 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:39 crc kubenswrapper[4831]: I1203 08:12:39.722959 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:41 crc kubenswrapper[4831]: I1203 08:12:41.244872 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:12:41 crc kubenswrapper[4831]: I1203 08:12:41.582877 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:12:41 crc kubenswrapper[4831]: I1203 08:12:41.664197 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-684d94669-whgkj"] Dec 03 08:12:41 crc kubenswrapper[4831]: I1203 08:12:41.734044 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-684d94669-whgkj" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon-log" containerID="cri-o://45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8" gracePeriod=30 Dec 03 08:12:41 crc kubenswrapper[4831]: I1203 08:12:41.734133 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-684d94669-whgkj" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon" containerID="cri-o://47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18" gracePeriod=30 Dec 03 08:12:45 crc kubenswrapper[4831]: I1203 08:12:45.782510 4831 generic.go:334] 
"Generic (PLEG): container finished" podID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerID="47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18" exitCode=0 Dec 03 08:12:45 crc kubenswrapper[4831]: I1203 08:12:45.782582 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-684d94669-whgkj" event={"ID":"9428499d-9f6e-4c02-befb-31ec9f68f080","Type":"ContainerDied","Data":"47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18"} Dec 03 08:12:46 crc kubenswrapper[4831]: I1203 08:12:46.820084 4831 generic.go:334] "Generic (PLEG): container finished" podID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerID="4b30e49b85f19483632d3406a2b8c323cb1f4a08f16fc633981378ca2004b5e9" exitCode=137 Dec 03 08:12:46 crc kubenswrapper[4831]: I1203 08:12:46.820394 4831 generic.go:334] "Generic (PLEG): container finished" podID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerID="f7671b418fa36a50c5d5c171e0f1d8106072742f881b63c834b8c5b2bf3bcd32" exitCode=137 Dec 03 08:12:46 crc kubenswrapper[4831]: I1203 08:12:46.820427 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8686484659-lc7t7" event={"ID":"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f","Type":"ContainerDied","Data":"4b30e49b85f19483632d3406a2b8c323cb1f4a08f16fc633981378ca2004b5e9"} Dec 03 08:12:46 crc kubenswrapper[4831]: I1203 08:12:46.820453 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8686484659-lc7t7" event={"ID":"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f","Type":"ContainerDied","Data":"f7671b418fa36a50c5d5c171e0f1d8106072742f881b63c834b8c5b2bf3bcd32"} Dec 03 08:12:46 crc kubenswrapper[4831]: I1203 08:12:46.820464 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-8686484659-lc7t7" event={"ID":"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f","Type":"ContainerDied","Data":"845271e17fc364bb89434ede012c7a0db3031a2fb9d2eff0b802a2e988f388c0"} Dec 03 08:12:46 crc kubenswrapper[4831]: I1203 08:12:46.820474 4831 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="845271e17fc364bb89434ede012c7a0db3031a2fb9d2eff0b802a2e988f388c0" Dec 03 08:12:46 crc kubenswrapper[4831]: I1203 08:12:46.901467 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.072428 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-horizon-secret-key\") pod \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.072549 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-config-data\") pod \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.072642 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-logs\") pod \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.072739 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4p8q\" (UniqueName: \"kubernetes.io/projected/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-kube-api-access-w4p8q\") pod \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.073290 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-logs" (OuterVolumeSpecName: "logs") pod 
"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" (UID: "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.073414 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-scripts\") pod \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\" (UID: \"0080a88b-0a3b-4a47-93fe-7e4cf451bb5f\") " Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.074032 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.081592 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" (UID: "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.082443 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-kube-api-access-w4p8q" (OuterVolumeSpecName: "kube-api-access-w4p8q") pod "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" (UID: "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f"). InnerVolumeSpecName "kube-api-access-w4p8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.111763 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-scripts" (OuterVolumeSpecName: "scripts") pod "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" (UID: "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.121780 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-config-data" (OuterVolumeSpecName: "config-data") pod "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" (UID: "0080a88b-0a3b-4a47-93fe-7e4cf451bb5f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.176762 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.176795 4831 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.176828 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.176840 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4p8q\" (UniqueName: \"kubernetes.io/projected/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f-kube-api-access-w4p8q\") on node \"crc\" DevicePath \"\"" Dec 03 08:12:47 
crc kubenswrapper[4831]: I1203 08:12:47.573993 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-684d94669-whgkj" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.118:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8080: connect: connection refused" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.832089 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-8686484659-lc7t7" Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.897004 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-8686484659-lc7t7"] Dec 03 08:12:47 crc kubenswrapper[4831]: I1203 08:12:47.907324 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-8686484659-lc7t7"] Dec 03 08:12:49 crc kubenswrapper[4831]: I1203 08:12:49.039259 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" path="/var/lib/kubelet/pods/0080a88b-0a3b-4a47-93fe-7e4cf451bb5f/volumes" Dec 03 08:12:57 crc kubenswrapper[4831]: I1203 08:12:57.574805 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-684d94669-whgkj" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.118:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8080: connect: connection refused" Dec 03 08:13:07 crc kubenswrapper[4831]: I1203 08:13:07.574513 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-684d94669-whgkj" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.118:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.118:8080: connect: connection refused" Dec 03 08:13:07 crc kubenswrapper[4831]: I1203 08:13:07.575214 4831 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.226702 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.263911 4831 generic.go:334] "Generic (PLEG): container finished" podID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerID="45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8" exitCode=137 Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.263965 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-684d94669-whgkj" event={"ID":"9428499d-9f6e-4c02-befb-31ec9f68f080","Type":"ContainerDied","Data":"45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8"} Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.264051 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-684d94669-whgkj" event={"ID":"9428499d-9f6e-4c02-befb-31ec9f68f080","Type":"ContainerDied","Data":"d57a8b255e5e43e1fb2603522d9e8d381d07fb636fe540e86d50c9bcdf1ad3e2"} Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.264073 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-684d94669-whgkj" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.264099 4831 scope.go:117] "RemoveContainer" containerID="47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.335886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-scripts\") pod \"9428499d-9f6e-4c02-befb-31ec9f68f080\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.335967 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-config-data\") pod \"9428499d-9f6e-4c02-befb-31ec9f68f080\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.336044 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xnrv\" (UniqueName: \"kubernetes.io/projected/9428499d-9f6e-4c02-befb-31ec9f68f080-kube-api-access-4xnrv\") pod \"9428499d-9f6e-4c02-befb-31ec9f68f080\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.336255 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9428499d-9f6e-4c02-befb-31ec9f68f080-logs\") pod \"9428499d-9f6e-4c02-befb-31ec9f68f080\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.337135 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9428499d-9f6e-4c02-befb-31ec9f68f080-logs" (OuterVolumeSpecName: "logs") pod "9428499d-9f6e-4c02-befb-31ec9f68f080" (UID: "9428499d-9f6e-4c02-befb-31ec9f68f080"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.337229 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9428499d-9f6e-4c02-befb-31ec9f68f080-horizon-secret-key\") pod \"9428499d-9f6e-4c02-befb-31ec9f68f080\" (UID: \"9428499d-9f6e-4c02-befb-31ec9f68f080\") " Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.338244 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9428499d-9f6e-4c02-befb-31ec9f68f080-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.341559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9428499d-9f6e-4c02-befb-31ec9f68f080-kube-api-access-4xnrv" (OuterVolumeSpecName: "kube-api-access-4xnrv") pod "9428499d-9f6e-4c02-befb-31ec9f68f080" (UID: "9428499d-9f6e-4c02-befb-31ec9f68f080"). InnerVolumeSpecName "kube-api-access-4xnrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.343489 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9428499d-9f6e-4c02-befb-31ec9f68f080-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "9428499d-9f6e-4c02-befb-31ec9f68f080" (UID: "9428499d-9f6e-4c02-befb-31ec9f68f080"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.380620 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-scripts" (OuterVolumeSpecName: "scripts") pod "9428499d-9f6e-4c02-befb-31ec9f68f080" (UID: "9428499d-9f6e-4c02-befb-31ec9f68f080"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.385149 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-config-data" (OuterVolumeSpecName: "config-data") pod "9428499d-9f6e-4c02-befb-31ec9f68f080" (UID: "9428499d-9f6e-4c02-befb-31ec9f68f080"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.440078 4831 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/9428499d-9f6e-4c02-befb-31ec9f68f080-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.440121 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.440135 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/9428499d-9f6e-4c02-befb-31ec9f68f080-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.440151 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xnrv\" (UniqueName: \"kubernetes.io/projected/9428499d-9f6e-4c02-befb-31ec9f68f080-kube-api-access-4xnrv\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.507839 4831 scope.go:117] "RemoveContainer" containerID="45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.529402 4831 scope.go:117] "RemoveContainer" containerID="47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18" Dec 03 08:13:12 crc kubenswrapper[4831]: E1203 08:13:12.530006 4831 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18\": container with ID starting with 47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18 not found: ID does not exist" containerID="47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.530052 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18"} err="failed to get container status \"47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18\": rpc error: code = NotFound desc = could not find container \"47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18\": container with ID starting with 47ca578bc43b993014e9c42c32e5eb1ab9784d9ad0ffcb6988755c993efaea18 not found: ID does not exist" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.530085 4831 scope.go:117] "RemoveContainer" containerID="45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8" Dec 03 08:13:12 crc kubenswrapper[4831]: E1203 08:13:12.530522 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8\": container with ID starting with 45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8 not found: ID does not exist" containerID="45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.530547 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8"} err="failed to get container status \"45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8\": rpc error: code = NotFound desc = could 
not find container \"45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8\": container with ID starting with 45fb9cdffcc2165dabb2cb2b3c73d02dd5995aeb6eb738a4cff516331b5e38f8 not found: ID does not exist" Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.621471 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-684d94669-whgkj"] Dec 03 08:13:12 crc kubenswrapper[4831]: I1203 08:13:12.636035 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-684d94669-whgkj"] Dec 03 08:13:13 crc kubenswrapper[4831]: I1203 08:13:13.060145 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" path="/var/lib/kubelet/pods/9428499d-9f6e-4c02-befb-31ec9f68f080/volumes" Dec 03 08:13:20 crc kubenswrapper[4831]: I1203 08:13:20.850380 4831 scope.go:117] "RemoveContainer" containerID="2bf8b40b5f8181eacfe88d3b130caa1eff0314b3f85e7e5b003597acb98746de" Dec 03 08:13:20 crc kubenswrapper[4831]: I1203 08:13:20.891329 4831 scope.go:117] "RemoveContainer" containerID="19ea6c2b93f47bab01399b4628a20b8a07fec2607fb8433e13f36493c877aba4" Dec 03 08:13:20 crc kubenswrapper[4831]: I1203 08:13:20.966629 4831 scope.go:117] "RemoveContainer" containerID="f66b2dd44e26e8b57f1d4f6970c830c14562f4786bbdd54f9d77f8f672414a21" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.966542 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-78874fb77c-dtxst"] Dec 03 08:13:23 crc kubenswrapper[4831]: E1203 08:13:23.967393 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerName="horizon" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.967410 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerName="horizon" Dec 03 08:13:23 crc kubenswrapper[4831]: E1203 08:13:23.967438 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.967447 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon" Dec 03 08:13:23 crc kubenswrapper[4831]: E1203 08:13:23.967466 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon-log" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.967476 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon-log" Dec 03 08:13:23 crc kubenswrapper[4831]: E1203 08:13:23.967490 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerName="horizon-log" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.967502 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerName="horizon-log" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.967763 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.967777 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerName="horizon" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.967796 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0080a88b-0a3b-4a47-93fe-7e4cf451bb5f" containerName="horizon-log" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.967815 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="9428499d-9f6e-4c02-befb-31ec9f68f080" containerName="horizon-log" Dec 03 08:13:23 crc kubenswrapper[4831]: I1203 08:13:23.969147 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.005085 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78874fb77c-dtxst"] Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.045025 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-scripts\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.045071 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-config-data\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.045114 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-logs\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.045500 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ddkj\" (UniqueName: \"kubernetes.io/projected/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-kube-api-access-9ddkj\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.045680 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-horizon-secret-key\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.071056 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-45f9-account-create-update-5m2gl"] Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.081842 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-4tpvx"] Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.090340 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-45f9-account-create-update-5m2gl"] Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.097788 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-4tpvx"] Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.147639 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-scripts\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.147682 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-config-data\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.147736 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-logs\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc 
kubenswrapper[4831]: I1203 08:13:24.147833 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ddkj\" (UniqueName: \"kubernetes.io/projected/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-kube-api-access-9ddkj\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.147881 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-horizon-secret-key\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.148528 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-logs\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.148711 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-scripts\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.149840 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-config-data\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.153240 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" 
(UniqueName: \"kubernetes.io/secret/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-horizon-secret-key\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.166150 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ddkj\" (UniqueName: \"kubernetes.io/projected/e0210cd0-445e-4bb0-94a4-b7387a4ff3fe-kube-api-access-9ddkj\") pod \"horizon-78874fb77c-dtxst\" (UID: \"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe\") " pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.305867 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:24 crc kubenswrapper[4831]: W1203 08:13:24.964486 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode0210cd0_445e_4bb0_94a4_b7387a4ff3fe.slice/crio-0b340d0baa9aedff90b75203af8045500526370b5da2eaabeb1e2623df9851c6 WatchSource:0}: Error finding container 0b340d0baa9aedff90b75203af8045500526370b5da2eaabeb1e2623df9851c6: Status 404 returned error can't find the container with id 0b340d0baa9aedff90b75203af8045500526370b5da2eaabeb1e2623df9851c6 Dec 03 08:13:24 crc kubenswrapper[4831]: I1203 08:13:24.966521 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-78874fb77c-dtxst"] Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.031561 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2743a206-e77a-445a-bbbc-66987e3357a8" path="/var/lib/kubelet/pods/2743a206-e77a-445a-bbbc-66987e3357a8/volumes" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.032683 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5d56d55-d26e-4957-9997-825ddfb815b9" path="/var/lib/kubelet/pods/e5d56d55-d26e-4957-9997-825ddfb815b9/volumes" Dec 03 
08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.343219 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-create-lvjzd"] Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.345510 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.363054 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-lvjzd"] Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.390906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blz45\" (UniqueName: \"kubernetes.io/projected/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-kube-api-access-blz45\") pod \"heat-db-create-lvjzd\" (UID: \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\") " pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.391185 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-operator-scripts\") pod \"heat-db-create-lvjzd\" (UID: \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\") " pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.451805 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-e94a-account-create-update-rk45r"] Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.453241 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.454094 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78874fb77c-dtxst" event={"ID":"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe","Type":"ContainerStarted","Data":"cedfc406995fa512e03057f2adf6e6617000310c3e001bb1d195286c4c643f26"} Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.454145 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78874fb77c-dtxst" event={"ID":"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe","Type":"ContainerStarted","Data":"1707b53db97b400f77a528f9754b83340dbcdc20af4a8c66c7de73d3a7b68c6f"} Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.454159 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-78874fb77c-dtxst" event={"ID":"e0210cd0-445e-4bb0-94a4-b7387a4ff3fe","Type":"ContainerStarted","Data":"0b340d0baa9aedff90b75203af8045500526370b5da2eaabeb1e2623df9851c6"} Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.456853 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-db-secret" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.463820 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-e94a-account-create-update-rk45r"] Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.492495 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-operator-scripts\") pod \"heat-db-create-lvjzd\" (UID: \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\") " pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.492678 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc8nk\" (UniqueName: 
\"kubernetes.io/projected/c1fdc7ed-1970-42d9-a759-9b9fa3566070-kube-api-access-sc8nk\") pod \"heat-e94a-account-create-update-rk45r\" (UID: \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\") " pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.500108 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1fdc7ed-1970-42d9-a759-9b9fa3566070-operator-scripts\") pod \"heat-e94a-account-create-update-rk45r\" (UID: \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\") " pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.500240 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blz45\" (UniqueName: \"kubernetes.io/projected/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-kube-api-access-blz45\") pod \"heat-db-create-lvjzd\" (UID: \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\") " pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.502743 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-operator-scripts\") pod \"heat-db-create-lvjzd\" (UID: \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\") " pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.530617 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blz45\" (UniqueName: \"kubernetes.io/projected/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-kube-api-access-blz45\") pod \"heat-db-create-lvjzd\" (UID: \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\") " pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.543007 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-78874fb77c-dtxst" 
podStartSLOduration=2.5429758 podStartE2EDuration="2.5429758s" podCreationTimestamp="2025-12-03 08:13:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:13:25.501652812 +0000 UTC m=+6142.845236320" watchObservedRunningTime="2025-12-03 08:13:25.5429758 +0000 UTC m=+6142.886559318" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.622157 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1fdc7ed-1970-42d9-a759-9b9fa3566070-operator-scripts\") pod \"heat-e94a-account-create-update-rk45r\" (UID: \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\") " pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.622352 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc8nk\" (UniqueName: \"kubernetes.io/projected/c1fdc7ed-1970-42d9-a759-9b9fa3566070-kube-api-access-sc8nk\") pod \"heat-e94a-account-create-update-rk45r\" (UID: \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\") " pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.622918 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1fdc7ed-1970-42d9-a759-9b9fa3566070-operator-scripts\") pod \"heat-e94a-account-create-update-rk45r\" (UID: \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\") " pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.642990 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc8nk\" (UniqueName: \"kubernetes.io/projected/c1fdc7ed-1970-42d9-a759-9b9fa3566070-kube-api-access-sc8nk\") pod \"heat-e94a-account-create-update-rk45r\" (UID: \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\") " 
pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.664022 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:25 crc kubenswrapper[4831]: I1203 08:13:25.778619 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:26 crc kubenswrapper[4831]: I1203 08:13:26.190240 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-create-lvjzd"] Dec 03 08:13:26 crc kubenswrapper[4831]: W1203 08:13:26.197405 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6b9f139e_f14a_4bc6_81b4_a9635b47c78e.slice/crio-d76a71025db007c6b04d2620f94f5e4d81daef9e3bbe20ae4154b0ba1fc80f92 WatchSource:0}: Error finding container d76a71025db007c6b04d2620f94f5e4d81daef9e3bbe20ae4154b0ba1fc80f92: Status 404 returned error can't find the container with id d76a71025db007c6b04d2620f94f5e4d81daef9e3bbe20ae4154b0ba1fc80f92 Dec 03 08:13:26 crc kubenswrapper[4831]: I1203 08:13:26.303480 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-e94a-account-create-update-rk45r"] Dec 03 08:13:26 crc kubenswrapper[4831]: I1203 08:13:26.465531 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-lvjzd" event={"ID":"6b9f139e-f14a-4bc6-81b4-a9635b47c78e","Type":"ContainerStarted","Data":"d76a71025db007c6b04d2620f94f5e4d81daef9e3bbe20ae4154b0ba1fc80f92"} Dec 03 08:13:26 crc kubenswrapper[4831]: I1203 08:13:26.466851 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e94a-account-create-update-rk45r" event={"ID":"c1fdc7ed-1970-42d9-a759-9b9fa3566070","Type":"ContainerStarted","Data":"7593aae9b9b4af044331c3c637f39b6f95219da88847404d14b7e9b93c8a4c69"} Dec 03 08:13:27 crc kubenswrapper[4831]: I1203 08:13:27.477248 4831 
generic.go:334] "Generic (PLEG): container finished" podID="c1fdc7ed-1970-42d9-a759-9b9fa3566070" containerID="b15a651c9f0c4e0bd6d04443d3e2b8688b5688b55c1a07a6678c42b0483741a1" exitCode=0 Dec 03 08:13:27 crc kubenswrapper[4831]: I1203 08:13:27.477300 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e94a-account-create-update-rk45r" event={"ID":"c1fdc7ed-1970-42d9-a759-9b9fa3566070","Type":"ContainerDied","Data":"b15a651c9f0c4e0bd6d04443d3e2b8688b5688b55c1a07a6678c42b0483741a1"} Dec 03 08:13:27 crc kubenswrapper[4831]: I1203 08:13:27.480579 4831 generic.go:334] "Generic (PLEG): container finished" podID="6b9f139e-f14a-4bc6-81b4-a9635b47c78e" containerID="64d4d0a737b87028c1806f952df218053da6065c4b1690bfcc007fb406fcc685" exitCode=0 Dec 03 08:13:27 crc kubenswrapper[4831]: I1203 08:13:27.480663 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-lvjzd" event={"ID":"6b9f139e-f14a-4bc6-81b4-a9635b47c78e","Type":"ContainerDied","Data":"64d4d0a737b87028c1806f952df218053da6065c4b1690bfcc007fb406fcc685"} Dec 03 08:13:27 crc kubenswrapper[4831]: I1203 08:13:27.598730 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:13:27 crc kubenswrapper[4831]: I1203 08:13:27.598782 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:13:28 crc kubenswrapper[4831]: I1203 08:13:28.923084 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:28 crc kubenswrapper[4831]: I1203 08:13:28.937351 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:28 crc kubenswrapper[4831]: I1203 08:13:28.988116 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1fdc7ed-1970-42d9-a759-9b9fa3566070-operator-scripts\") pod \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\" (UID: \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\") " Dec 03 08:13:28 crc kubenswrapper[4831]: I1203 08:13:28.988405 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sc8nk\" (UniqueName: \"kubernetes.io/projected/c1fdc7ed-1970-42d9-a759-9b9fa3566070-kube-api-access-sc8nk\") pod \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\" (UID: \"c1fdc7ed-1970-42d9-a759-9b9fa3566070\") " Dec 03 08:13:28 crc kubenswrapper[4831]: I1203 08:13:28.988642 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1fdc7ed-1970-42d9-a759-9b9fa3566070-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c1fdc7ed-1970-42d9-a759-9b9fa3566070" (UID: "c1fdc7ed-1970-42d9-a759-9b9fa3566070"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:13:28 crc kubenswrapper[4831]: I1203 08:13:28.989075 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c1fdc7ed-1970-42d9-a759-9b9fa3566070-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.010590 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1fdc7ed-1970-42d9-a759-9b9fa3566070-kube-api-access-sc8nk" (OuterVolumeSpecName: "kube-api-access-sc8nk") pod "c1fdc7ed-1970-42d9-a759-9b9fa3566070" (UID: "c1fdc7ed-1970-42d9-a759-9b9fa3566070"). InnerVolumeSpecName "kube-api-access-sc8nk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.090616 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-operator-scripts\") pod \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\" (UID: \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\") " Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.090658 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blz45\" (UniqueName: \"kubernetes.io/projected/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-kube-api-access-blz45\") pod \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\" (UID: \"6b9f139e-f14a-4bc6-81b4-a9635b47c78e\") " Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.091502 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6b9f139e-f14a-4bc6-81b4-a9635b47c78e" (UID: "6b9f139e-f14a-4bc6-81b4-a9635b47c78e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.092829 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sc8nk\" (UniqueName: \"kubernetes.io/projected/c1fdc7ed-1970-42d9-a759-9b9fa3566070-kube-api-access-sc8nk\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.092851 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.096615 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-kube-api-access-blz45" (OuterVolumeSpecName: "kube-api-access-blz45") pod "6b9f139e-f14a-4bc6-81b4-a9635b47c78e" (UID: "6b9f139e-f14a-4bc6-81b4-a9635b47c78e"). InnerVolumeSpecName "kube-api-access-blz45". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.194440 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blz45\" (UniqueName: \"kubernetes.io/projected/6b9f139e-f14a-4bc6-81b4-a9635b47c78e-kube-api-access-blz45\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.506084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-e94a-account-create-update-rk45r" event={"ID":"c1fdc7ed-1970-42d9-a759-9b9fa3566070","Type":"ContainerDied","Data":"7593aae9b9b4af044331c3c637f39b6f95219da88847404d14b7e9b93c8a4c69"} Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.506130 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7593aae9b9b4af044331c3c637f39b6f95219da88847404d14b7e9b93c8a4c69" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.506209 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/heat-e94a-account-create-update-rk45r" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.508150 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-create-lvjzd" event={"ID":"6b9f139e-f14a-4bc6-81b4-a9635b47c78e","Type":"ContainerDied","Data":"d76a71025db007c6b04d2620f94f5e4d81daef9e3bbe20ae4154b0ba1fc80f92"} Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.508177 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d76a71025db007c6b04d2620f94f5e4d81daef9e3bbe20ae4154b0ba1fc80f92" Dec 03 08:13:29 crc kubenswrapper[4831]: I1203 08:13:29.508268 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-create-lvjzd" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.566582 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-db-sync-l5zkp"] Dec 03 08:13:30 crc kubenswrapper[4831]: E1203 08:13:30.567537 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1fdc7ed-1970-42d9-a759-9b9fa3566070" containerName="mariadb-account-create-update" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.567557 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1fdc7ed-1970-42d9-a759-9b9fa3566070" containerName="mariadb-account-create-update" Dec 03 08:13:30 crc kubenswrapper[4831]: E1203 08:13:30.567589 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b9f139e-f14a-4bc6-81b4-a9635b47c78e" containerName="mariadb-database-create" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.567599 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b9f139e-f14a-4bc6-81b4-a9635b47c78e" containerName="mariadb-database-create" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.567889 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b9f139e-f14a-4bc6-81b4-a9635b47c78e" containerName="mariadb-database-create" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.567932 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1fdc7ed-1970-42d9-a759-9b9fa3566070" containerName="mariadb-account-create-update" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.568865 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.571945 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bnjwn" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.572406 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.586602 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-l5zkp"] Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.732853 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-combined-ca-bundle\") pod \"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.732924 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qtgp\" (UniqueName: \"kubernetes.io/projected/fb6265a8-c4d5-43b4-b025-35a01e45411f-kube-api-access-2qtgp\") pod \"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.733040 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-config-data\") pod \"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.835727 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-combined-ca-bundle\") pod 
\"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.835799 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qtgp\" (UniqueName: \"kubernetes.io/projected/fb6265a8-c4d5-43b4-b025-35a01e45411f-kube-api-access-2qtgp\") pod \"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.835849 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-config-data\") pod \"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.850344 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-config-data\") pod \"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.851427 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-combined-ca-bundle\") pod \"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 08:13:30.867741 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qtgp\" (UniqueName: \"kubernetes.io/projected/fb6265a8-c4d5-43b4-b025-35a01e45411f-kube-api-access-2qtgp\") pod \"heat-db-sync-l5zkp\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:30 crc kubenswrapper[4831]: I1203 
08:13:30.924762 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:31 crc kubenswrapper[4831]: I1203 08:13:31.043730 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-lzg9j"] Dec 03 08:13:31 crc kubenswrapper[4831]: I1203 08:13:31.058601 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-lzg9j"] Dec 03 08:13:32 crc kubenswrapper[4831]: I1203 08:13:32.256581 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-db-sync-l5zkp"] Dec 03 08:13:32 crc kubenswrapper[4831]: I1203 08:13:32.260129 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:13:32 crc kubenswrapper[4831]: I1203 08:13:32.536850 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-l5zkp" event={"ID":"fb6265a8-c4d5-43b4-b025-35a01e45411f","Type":"ContainerStarted","Data":"a9a6df25d9677e02678b0e4742d3ef7fe75c49a6c00a39167b4e02c080b78538"} Dec 03 08:13:33 crc kubenswrapper[4831]: I1203 08:13:33.051243 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a5d905f-87c5-4c09-8825-4938cda62ee7" path="/var/lib/kubelet/pods/4a5d905f-87c5-4c09-8825-4938cda62ee7/volumes" Dec 03 08:13:34 crc kubenswrapper[4831]: I1203 08:13:34.307031 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:34 crc kubenswrapper[4831]: I1203 08:13:34.307377 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:39 crc kubenswrapper[4831]: I1203 08:13:39.606919 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-l5zkp" event={"ID":"fb6265a8-c4d5-43b4-b025-35a01e45411f","Type":"ContainerStarted","Data":"66f7ba7aa5e988e305a00d2993bce58bbd7a812efb1462d67e0b03bbd30e2a10"} Dec 03 08:13:39 crc 
kubenswrapper[4831]: I1203 08:13:39.638224 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-db-sync-l5zkp" podStartSLOduration=2.765189329 podStartE2EDuration="9.638199963s" podCreationTimestamp="2025-12-03 08:13:30 +0000 UTC" firstStartedPulling="2025-12-03 08:13:32.259811488 +0000 UTC m=+6149.603394996" lastFinishedPulling="2025-12-03 08:13:39.132822122 +0000 UTC m=+6156.476405630" observedRunningTime="2025-12-03 08:13:39.62301628 +0000 UTC m=+6156.966599808" watchObservedRunningTime="2025-12-03 08:13:39.638199963 +0000 UTC m=+6156.981783471" Dec 03 08:13:42 crc kubenswrapper[4831]: I1203 08:13:42.652436 4831 generic.go:334] "Generic (PLEG): container finished" podID="fb6265a8-c4d5-43b4-b025-35a01e45411f" containerID="66f7ba7aa5e988e305a00d2993bce58bbd7a812efb1462d67e0b03bbd30e2a10" exitCode=0 Dec 03 08:13:42 crc kubenswrapper[4831]: I1203 08:13:42.652546 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-l5zkp" event={"ID":"fb6265a8-c4d5-43b4-b025-35a01e45411f","Type":"ContainerDied","Data":"66f7ba7aa5e988e305a00d2993bce58bbd7a812efb1462d67e0b03bbd30e2a10"} Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.016747 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.193551 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2qtgp\" (UniqueName: \"kubernetes.io/projected/fb6265a8-c4d5-43b4-b025-35a01e45411f-kube-api-access-2qtgp\") pod \"fb6265a8-c4d5-43b4-b025-35a01e45411f\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.193745 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-combined-ca-bundle\") pod \"fb6265a8-c4d5-43b4-b025-35a01e45411f\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.193989 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-config-data\") pod \"fb6265a8-c4d5-43b4-b025-35a01e45411f\" (UID: \"fb6265a8-c4d5-43b4-b025-35a01e45411f\") " Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.202135 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb6265a8-c4d5-43b4-b025-35a01e45411f-kube-api-access-2qtgp" (OuterVolumeSpecName: "kube-api-access-2qtgp") pod "fb6265a8-c4d5-43b4-b025-35a01e45411f" (UID: "fb6265a8-c4d5-43b4-b025-35a01e45411f"). InnerVolumeSpecName "kube-api-access-2qtgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.229525 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb6265a8-c4d5-43b4-b025-35a01e45411f" (UID: "fb6265a8-c4d5-43b4-b025-35a01e45411f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.272608 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-config-data" (OuterVolumeSpecName: "config-data") pod "fb6265a8-c4d5-43b4-b025-35a01e45411f" (UID: "fb6265a8-c4d5-43b4-b025-35a01e45411f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.296656 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2qtgp\" (UniqueName: \"kubernetes.io/projected/fb6265a8-c4d5-43b4-b025-35a01e45411f-kube-api-access-2qtgp\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.296685 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.296695 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb6265a8-c4d5-43b4-b025-35a01e45411f-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.678400 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-db-sync-l5zkp" event={"ID":"fb6265a8-c4d5-43b4-b025-35a01e45411f","Type":"ContainerDied","Data":"a9a6df25d9677e02678b0e4742d3ef7fe75c49a6c00a39167b4e02c080b78538"} Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.678457 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a9a6df25d9677e02678b0e4742d3ef7fe75c49a6c00a39167b4e02c080b78538" Dec 03 08:13:44 crc kubenswrapper[4831]: I1203 08:13:44.678505 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-db-sync-l5zkp" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.675629 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-engine-675cdcc9cb-xvrsg"] Dec 03 08:13:45 crc kubenswrapper[4831]: E1203 08:13:45.676452 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb6265a8-c4d5-43b4-b025-35a01e45411f" containerName="heat-db-sync" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.676470 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb6265a8-c4d5-43b4-b025-35a01e45411f" containerName="heat-db-sync" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.676701 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb6265a8-c4d5-43b4-b025-35a01e45411f" containerName="heat-db-sync" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.677610 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.685250 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-heat-dockercfg-bnjwn" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.689981 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-engine-config-data" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.690371 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-config-data" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.720671 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-675cdcc9cb-xvrsg"] Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.796934 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-api-86c4895765-xdwvz"] Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.798225 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.802647 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-api-config-data" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.808441 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-86c4895765-xdwvz"] Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.838463 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvhh9\" (UniqueName: \"kubernetes.io/projected/b7767894-04f2-4d29-b76b-7157e045803d-kube-api-access-qvhh9\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.838623 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-config-data-custom\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.838664 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-config-data\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.838695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-combined-ca-bundle\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " 
pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.920497 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/heat-cfnapi-56bc69b54c-5mw4x"] Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.966574 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-56bc69b54c-5mw4x"] Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.966698 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968100 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-config-data\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968150 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-combined-ca-bundle\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968235 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-combined-ca-bundle\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968267 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vggrn\" (UniqueName: 
\"kubernetes.io/projected/eb164731-d0b2-4f1a-992b-48d19d451819-kube-api-access-vggrn\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968297 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-config-data\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968345 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-config-data-custom\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968374 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-config-data\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968409 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvhh9\" (UniqueName: \"kubernetes.io/projected/b7767894-04f2-4d29-b76b-7157e045803d-kube-api-access-qvhh9\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968453 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-combined-ca-bundle\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.968478 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfs8d\" (UniqueName: \"kubernetes.io/projected/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-kube-api-access-gfs8d\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.969039 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-config-data-custom\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.969158 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-config-data-custom\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.975833 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"heat-cfnapi-config-data" Dec 03 08:13:45 crc kubenswrapper[4831]: I1203 08:13:45.979288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-config-data-custom\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " 
pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.024174 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvhh9\" (UniqueName: \"kubernetes.io/projected/b7767894-04f2-4d29-b76b-7157e045803d-kube-api-access-qvhh9\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.030839 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-config-data\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.032042 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7767894-04f2-4d29-b76b-7157e045803d-combined-ca-bundle\") pod \"heat-engine-675cdcc9cb-xvrsg\" (UID: \"b7767894-04f2-4d29-b76b-7157e045803d\") " pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.072127 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-combined-ca-bundle\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.072202 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vggrn\" (UniqueName: \"kubernetes.io/projected/eb164731-d0b2-4f1a-992b-48d19d451819-kube-api-access-vggrn\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 
03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.072240 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-config-data\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.072269 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-config-data-custom\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.072298 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-config-data\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.072383 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-combined-ca-bundle\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.072415 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gfs8d\" (UniqueName: \"kubernetes.io/projected/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-kube-api-access-gfs8d\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.072483 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-config-data-custom\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.078846 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-config-data-custom\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.079802 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-config-data-custom\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.091108 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-config-data\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.094080 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-combined-ca-bundle\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.096496 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-combined-ca-bundle\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.097215 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/eb164731-d0b2-4f1a-992b-48d19d451819-config-data\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.097778 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vggrn\" (UniqueName: \"kubernetes.io/projected/eb164731-d0b2-4f1a-992b-48d19d451819-kube-api-access-vggrn\") pod \"heat-cfnapi-56bc69b54c-5mw4x\" (UID: \"eb164731-d0b2-4f1a-992b-48d19d451819\") " pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.102752 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfs8d\" (UniqueName: \"kubernetes.io/projected/749aa4ea-dd38-4c6c-a33f-a7467e7d76ab-kube-api-access-gfs8d\") pod \"heat-api-86c4895765-xdwvz\" (UID: \"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab\") " pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.117078 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.219366 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.300500 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.314202 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.645798 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-api-86c4895765-xdwvz"] Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.703666 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86c4895765-xdwvz" event={"ID":"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab","Type":"ContainerStarted","Data":"656260bf4e4dae0a2689898ed6c365814c5921db6b6106f42e255a0e6d4dac5c"} Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.791946 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-cfnapi-56bc69b54c-5mw4x"] Dec 03 08:13:46 crc kubenswrapper[4831]: W1203 08:13:46.794531 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb164731_d0b2_4f1a_992b_48d19d451819.slice/crio-dd4e44675f3dd7de63e46ad27097c2ebfbb16e4fe8279d69e18b766d912f1a55 WatchSource:0}: Error finding container dd4e44675f3dd7de63e46ad27097c2ebfbb16e4fe8279d69e18b766d912f1a55: Status 404 returned error can't find the container with id dd4e44675f3dd7de63e46ad27097c2ebfbb16e4fe8279d69e18b766d912f1a55 Dec 03 08:13:46 crc kubenswrapper[4831]: I1203 08:13:46.911812 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/heat-engine-675cdcc9cb-xvrsg"] Dec 03 08:13:46 crc kubenswrapper[4831]: W1203 08:13:46.912827 4831 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7767894_04f2_4d29_b76b_7157e045803d.slice/crio-5ed4d35c0552f3acd8d4aac943f1770b90ee91b6061ad85f129856b0af5fa35d WatchSource:0}: Error finding container 5ed4d35c0552f3acd8d4aac943f1770b90ee91b6061ad85f129856b0af5fa35d: Status 404 returned error can't find the container with id 5ed4d35c0552f3acd8d4aac943f1770b90ee91b6061ad85f129856b0af5fa35d Dec 03 08:13:47 crc kubenswrapper[4831]: I1203 08:13:47.788664 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-675cdcc9cb-xvrsg" event={"ID":"b7767894-04f2-4d29-b76b-7157e045803d","Type":"ContainerStarted","Data":"d6cc4ead1e824f2e3877275e8d4bbe9e9c7a7067a36493724c929fd61db2a5b8"} Dec 03 08:13:47 crc kubenswrapper[4831]: I1203 08:13:47.789073 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-engine-675cdcc9cb-xvrsg" event={"ID":"b7767894-04f2-4d29-b76b-7157e045803d","Type":"ContainerStarted","Data":"5ed4d35c0552f3acd8d4aac943f1770b90ee91b6061ad85f129856b0af5fa35d"} Dec 03 08:13:47 crc kubenswrapper[4831]: I1203 08:13:47.790222 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:13:47 crc kubenswrapper[4831]: I1203 08:13:47.807512 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" event={"ID":"eb164731-d0b2-4f1a-992b-48d19d451819","Type":"ContainerStarted","Data":"dd4e44675f3dd7de63e46ad27097c2ebfbb16e4fe8279d69e18b766d912f1a55"} Dec 03 08:13:47 crc kubenswrapper[4831]: I1203 08:13:47.822009 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-engine-675cdcc9cb-xvrsg" podStartSLOduration=2.821991595 podStartE2EDuration="2.821991595s" podCreationTimestamp="2025-12-03 08:13:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:13:47.811780476 +0000 
UTC m=+6165.155363974" watchObservedRunningTime="2025-12-03 08:13:47.821991595 +0000 UTC m=+6165.165575103" Dec 03 08:13:48 crc kubenswrapper[4831]: I1203 08:13:48.432474 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-78874fb77c-dtxst" Dec 03 08:13:48 crc kubenswrapper[4831]: I1203 08:13:48.514129 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54959bccf-pc4kb"] Dec 03 08:13:48 crc kubenswrapper[4831]: I1203 08:13:48.515252 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54959bccf-pc4kb" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon" containerID="cri-o://b0050ee94f57530d896caebecbbc9c0375ab39b99d2ef48d7fcb100dc77c32a1" gracePeriod=30 Dec 03 08:13:48 crc kubenswrapper[4831]: I1203 08:13:48.515123 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-54959bccf-pc4kb" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon-log" containerID="cri-o://cabda8be23ce9da31fff09b2b69121c21e28f01612c4d672affbf094037a9161" gracePeriod=30 Dec 03 08:13:49 crc kubenswrapper[4831]: I1203 08:13:49.831190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" event={"ID":"eb164731-d0b2-4f1a-992b-48d19d451819","Type":"ContainerStarted","Data":"567881caaa834f5f6683a9c571421a35503bdf949ce5fddd1544191f7b844e74"} Dec 03 08:13:49 crc kubenswrapper[4831]: I1203 08:13:49.831727 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:49 crc kubenswrapper[4831]: I1203 08:13:49.832948 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/heat-api-86c4895765-xdwvz" event={"ID":"749aa4ea-dd38-4c6c-a33f-a7467e7d76ab","Type":"ContainerStarted","Data":"6663ec9f9a73fa1236d51c5cf89c412a204f4a9f5595e22e161c7e051b479226"} Dec 03 08:13:49 crc kubenswrapper[4831]: I1203 
08:13:49.854441 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" podStartSLOduration=2.34410828 podStartE2EDuration="4.854417979s" podCreationTimestamp="2025-12-03 08:13:45 +0000 UTC" firstStartedPulling="2025-12-03 08:13:46.796686739 +0000 UTC m=+6164.140270247" lastFinishedPulling="2025-12-03 08:13:49.306996438 +0000 UTC m=+6166.650579946" observedRunningTime="2025-12-03 08:13:49.846279445 +0000 UTC m=+6167.189862953" watchObservedRunningTime="2025-12-03 08:13:49.854417979 +0000 UTC m=+6167.198001497" Dec 03 08:13:49 crc kubenswrapper[4831]: I1203 08:13:49.869261 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/heat-api-86c4895765-xdwvz" podStartSLOduration=2.218733185 podStartE2EDuration="4.86924028s" podCreationTimestamp="2025-12-03 08:13:45 +0000 UTC" firstStartedPulling="2025-12-03 08:13:46.648411941 +0000 UTC m=+6163.991995449" lastFinishedPulling="2025-12-03 08:13:49.298919036 +0000 UTC m=+6166.642502544" observedRunningTime="2025-12-03 08:13:49.86605914 +0000 UTC m=+6167.209642648" watchObservedRunningTime="2025-12-03 08:13:49.86924028 +0000 UTC m=+6167.212823778" Dec 03 08:13:50 crc kubenswrapper[4831]: I1203 08:13:50.840528 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:51 crc kubenswrapper[4831]: I1203 08:13:51.854389 4831 generic.go:334] "Generic (PLEG): container finished" podID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerID="b0050ee94f57530d896caebecbbc9c0375ab39b99d2ef48d7fcb100dc77c32a1" exitCode=0 Dec 03 08:13:51 crc kubenswrapper[4831]: I1203 08:13:51.855351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54959bccf-pc4kb" event={"ID":"acdd083e-b37d-4cdf-b856-3eabe411df76","Type":"ContainerDied","Data":"b0050ee94f57530d896caebecbbc9c0375ab39b99d2ef48d7fcb100dc77c32a1"} Dec 03 08:13:57 crc kubenswrapper[4831]: I1203 08:13:57.464177 
4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-api-86c4895765-xdwvz" Dec 03 08:13:57 crc kubenswrapper[4831]: I1203 08:13:57.555233 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-cfnapi-56bc69b54c-5mw4x" Dec 03 08:13:57 crc kubenswrapper[4831]: I1203 08:13:57.596900 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:13:57 crc kubenswrapper[4831]: I1203 08:13:57.596973 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:13:58 crc kubenswrapper[4831]: I1203 08:13:58.002812 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54959bccf-pc4kb" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused" Dec 03 08:14:03 crc kubenswrapper[4831]: I1203 08:14:03.087866 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-d99f-account-create-update-7hbdc"] Dec 03 08:14:03 crc kubenswrapper[4831]: I1203 08:14:03.096881 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-rmdsf"] Dec 03 08:14:03 crc kubenswrapper[4831]: I1203 08:14:03.103994 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-rmdsf"] Dec 03 08:14:03 crc kubenswrapper[4831]: I1203 08:14:03.111241 
4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-d99f-account-create-update-7hbdc"] Dec 03 08:14:05 crc kubenswrapper[4831]: I1203 08:14:05.037847 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62186717-0243-43c0-be16-bc066648e2ac" path="/var/lib/kubelet/pods/62186717-0243-43c0-be16-bc066648e2ac/volumes" Dec 03 08:14:05 crc kubenswrapper[4831]: I1203 08:14:05.040189 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="be898fbd-2ab0-4c28-8889-dc773c95348e" path="/var/lib/kubelet/pods/be898fbd-2ab0-4c28-8889-dc773c95348e/volumes" Dec 03 08:14:06 crc kubenswrapper[4831]: I1203 08:14:06.366930 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/heat-engine-675cdcc9cb-xvrsg" Dec 03 08:14:08 crc kubenswrapper[4831]: I1203 08:14:08.002872 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54959bccf-pc4kb" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused" Dec 03 08:14:09 crc kubenswrapper[4831]: I1203 08:14:09.066528 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-st2kv"] Dec 03 08:14:09 crc kubenswrapper[4831]: I1203 08:14:09.087344 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-st2kv"] Dec 03 08:14:11 crc kubenswrapper[4831]: I1203 08:14:11.027049 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2d08214-4528-4f01-ab7b-70ae27f6bd7f" path="/var/lib/kubelet/pods/a2d08214-4528-4f01-ab7b-70ae27f6bd7f/volumes" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.702179 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2"] Dec 03 08:14:16 crc 
kubenswrapper[4831]: I1203 08:14:16.704841 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.707969 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.726414 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2"] Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.730445 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qkjc\" (UniqueName: \"kubernetes.io/projected/f064abcd-d3d7-4a44-a224-6fde3a142406-kube-api-access-4qkjc\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.730487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.730544 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " 
pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.831257 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.831497 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4qkjc\" (UniqueName: \"kubernetes.io/projected/f064abcd-d3d7-4a44-a224-6fde3a142406-kube-api-access-4qkjc\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.831538 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.832069 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-bundle\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.832101 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-util\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:16 crc kubenswrapper[4831]: I1203 08:14:16.855168 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qkjc\" (UniqueName: \"kubernetes.io/projected/f064abcd-d3d7-4a44-a224-6fde3a142406-kube-api-access-4qkjc\") pod \"6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:17 crc kubenswrapper[4831]: I1203 08:14:17.092810 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:17 crc kubenswrapper[4831]: I1203 08:14:17.695172 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2"] Dec 03 08:14:18 crc kubenswrapper[4831]: I1203 08:14:18.003888 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-54959bccf-pc4kb" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon" probeResult="failure" output="Get \"http://10.217.1.119:8080/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.1.119:8080: connect: connection refused" Dec 03 08:14:18 crc kubenswrapper[4831]: I1203 08:14:18.004476 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:14:18 crc kubenswrapper[4831]: I1203 08:14:18.420507 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerID="5e04f16cfe5808735ea165b4bbdedf6d10955ee726e651c6cb32e72ec119d31e" exitCode=0 Dec 03 08:14:18 crc kubenswrapper[4831]: I1203 08:14:18.420602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" event={"ID":"f064abcd-d3d7-4a44-a224-6fde3a142406","Type":"ContainerDied","Data":"5e04f16cfe5808735ea165b4bbdedf6d10955ee726e651c6cb32e72ec119d31e"} Dec 03 08:14:18 crc kubenswrapper[4831]: I1203 08:14:18.421063 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" event={"ID":"f064abcd-d3d7-4a44-a224-6fde3a142406","Type":"ContainerStarted","Data":"0b11d0c42d9c0e3492b5ab00ea0d3239900b1d2fcfdc2449849d4f58a1c84392"} Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.431372 4831 generic.go:334] "Generic (PLEG): container finished" podID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerID="cabda8be23ce9da31fff09b2b69121c21e28f01612c4d672affbf094037a9161" exitCode=137 Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.431483 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54959bccf-pc4kb" event={"ID":"acdd083e-b37d-4cdf-b856-3eabe411df76","Type":"ContainerDied","Data":"cabda8be23ce9da31fff09b2b69121c21e28f01612c4d672affbf094037a9161"} Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.608865 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.793259 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-config-data\") pod \"acdd083e-b37d-4cdf-b856-3eabe411df76\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.793341 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acdd083e-b37d-4cdf-b856-3eabe411df76-logs\") pod \"acdd083e-b37d-4cdf-b856-3eabe411df76\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.793447 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/acdd083e-b37d-4cdf-b856-3eabe411df76-horizon-secret-key\") pod \"acdd083e-b37d-4cdf-b856-3eabe411df76\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.793488 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4wc8\" (UniqueName: \"kubernetes.io/projected/acdd083e-b37d-4cdf-b856-3eabe411df76-kube-api-access-k4wc8\") pod \"acdd083e-b37d-4cdf-b856-3eabe411df76\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.793625 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-scripts\") pod \"acdd083e-b37d-4cdf-b856-3eabe411df76\" (UID: \"acdd083e-b37d-4cdf-b856-3eabe411df76\") " Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.795055 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/acdd083e-b37d-4cdf-b856-3eabe411df76-logs" (OuterVolumeSpecName: "logs") pod "acdd083e-b37d-4cdf-b856-3eabe411df76" (UID: "acdd083e-b37d-4cdf-b856-3eabe411df76"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.799339 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acdd083e-b37d-4cdf-b856-3eabe411df76-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "acdd083e-b37d-4cdf-b856-3eabe411df76" (UID: "acdd083e-b37d-4cdf-b856-3eabe411df76"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.799740 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acdd083e-b37d-4cdf-b856-3eabe411df76-kube-api-access-k4wc8" (OuterVolumeSpecName: "kube-api-access-k4wc8") pod "acdd083e-b37d-4cdf-b856-3eabe411df76" (UID: "acdd083e-b37d-4cdf-b856-3eabe411df76"). InnerVolumeSpecName "kube-api-access-k4wc8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.820034 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-scripts" (OuterVolumeSpecName: "scripts") pod "acdd083e-b37d-4cdf-b856-3eabe411df76" (UID: "acdd083e-b37d-4cdf-b856-3eabe411df76"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.827623 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-config-data" (OuterVolumeSpecName: "config-data") pod "acdd083e-b37d-4cdf-b856-3eabe411df76" (UID: "acdd083e-b37d-4cdf-b856-3eabe411df76"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.895769 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4wc8\" (UniqueName: \"kubernetes.io/projected/acdd083e-b37d-4cdf-b856-3eabe411df76-kube-api-access-k4wc8\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.896063 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.896075 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/acdd083e-b37d-4cdf-b856-3eabe411df76-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.896085 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/acdd083e-b37d-4cdf-b856-3eabe411df76-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:19 crc kubenswrapper[4831]: I1203 08:14:19.896093 4831 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/acdd083e-b37d-4cdf-b856-3eabe411df76-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:20 crc kubenswrapper[4831]: I1203 08:14:20.444891 4831 generic.go:334] "Generic (PLEG): container finished" podID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerID="3cdbcf7a1fe09e0a6a72235b51dbb91ba73a34bc700820572b662fb711ddfca8" exitCode=0 Dec 03 08:14:20 crc kubenswrapper[4831]: I1203 08:14:20.445187 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" event={"ID":"f064abcd-d3d7-4a44-a224-6fde3a142406","Type":"ContainerDied","Data":"3cdbcf7a1fe09e0a6a72235b51dbb91ba73a34bc700820572b662fb711ddfca8"} Dec 03 
08:14:20 crc kubenswrapper[4831]: I1203 08:14:20.448698 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-54959bccf-pc4kb" event={"ID":"acdd083e-b37d-4cdf-b856-3eabe411df76","Type":"ContainerDied","Data":"57938041de8357ccd9e7d6ab3253c6dd6bbb96a926ea4ccfe497d978bf3b5789"} Dec 03 08:14:20 crc kubenswrapper[4831]: I1203 08:14:20.448772 4831 scope.go:117] "RemoveContainer" containerID="b0050ee94f57530d896caebecbbc9c0375ab39b99d2ef48d7fcb100dc77c32a1" Dec 03 08:14:20 crc kubenswrapper[4831]: I1203 08:14:20.448795 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-54959bccf-pc4kb" Dec 03 08:14:20 crc kubenswrapper[4831]: I1203 08:14:20.499386 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-54959bccf-pc4kb"] Dec 03 08:14:20 crc kubenswrapper[4831]: I1203 08:14:20.506576 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-54959bccf-pc4kb"] Dec 03 08:14:20 crc kubenswrapper[4831]: I1203 08:14:20.673809 4831 scope.go:117] "RemoveContainer" containerID="cabda8be23ce9da31fff09b2b69121c21e28f01612c4d672affbf094037a9161" Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 08:14:21.026777 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" path="/var/lib/kubelet/pods/acdd083e-b37d-4cdf-b856-3eabe411df76/volumes" Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 08:14:21.100552 4831 scope.go:117] "RemoveContainer" containerID="98b5520b362a0a5d305ff43584b662103223bc53c4b1e3de871396757d8d30b4" Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 08:14:21.160655 4831 scope.go:117] "RemoveContainer" containerID="f1b6f8adcaf9d7621b93036f3120bada0ab61cc07f41f8a61bcfc70ba9fbe979" Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 08:14:21.198726 4831 scope.go:117] "RemoveContainer" containerID="dbeed3b0cf0f639fe8f509483e0df43eceb9bb8b54d9db23191dc5256fd851e8" Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 
08:14:21.313477 4831 scope.go:117] "RemoveContainer" containerID="4e01e64ab1f88f36dd171cfe9a36cd281b44d569786258a1401e360ac84ea1bb" Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 08:14:21.396308 4831 scope.go:117] "RemoveContainer" containerID="63d6d01accef261a28212cf6f728bfc2b8a9c76e1d5eb7bf598a62e5d3341ae2" Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 08:14:21.438901 4831 scope.go:117] "RemoveContainer" containerID="098d6a615ad31864eb4583e6aa8c0497c2cb4eb26eed7d8c44d604ad34cda9ba" Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 08:14:21.469652 4831 generic.go:334] "Generic (PLEG): container finished" podID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerID="056c778cc1e75398a31b7f295b85340184d6fd148f5a62a286061d0fe7b5b853" exitCode=0 Dec 03 08:14:21 crc kubenswrapper[4831]: I1203 08:14:21.469717 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" event={"ID":"f064abcd-d3d7-4a44-a224-6fde3a142406","Type":"ContainerDied","Data":"056c778cc1e75398a31b7f295b85340184d6fd148f5a62a286061d0fe7b5b853"} Dec 03 08:14:22 crc kubenswrapper[4831]: I1203 08:14:22.959833 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.067798 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-util\") pod \"f064abcd-d3d7-4a44-a224-6fde3a142406\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.068170 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-bundle\") pod \"f064abcd-d3d7-4a44-a224-6fde3a142406\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.068260 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4qkjc\" (UniqueName: \"kubernetes.io/projected/f064abcd-d3d7-4a44-a224-6fde3a142406-kube-api-access-4qkjc\") pod \"f064abcd-d3d7-4a44-a224-6fde3a142406\" (UID: \"f064abcd-d3d7-4a44-a224-6fde3a142406\") " Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.071770 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-bundle" (OuterVolumeSpecName: "bundle") pod "f064abcd-d3d7-4a44-a224-6fde3a142406" (UID: "f064abcd-d3d7-4a44-a224-6fde3a142406"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.075220 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f064abcd-d3d7-4a44-a224-6fde3a142406-kube-api-access-4qkjc" (OuterVolumeSpecName: "kube-api-access-4qkjc") pod "f064abcd-d3d7-4a44-a224-6fde3a142406" (UID: "f064abcd-d3d7-4a44-a224-6fde3a142406"). InnerVolumeSpecName "kube-api-access-4qkjc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.083354 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-util" (OuterVolumeSpecName: "util") pod "f064abcd-d3d7-4a44-a224-6fde3a142406" (UID: "f064abcd-d3d7-4a44-a224-6fde3a142406"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.170670 4831 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-util\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.170704 4831 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/f064abcd-d3d7-4a44-a224-6fde3a142406-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.170717 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4qkjc\" (UniqueName: \"kubernetes.io/projected/f064abcd-d3d7-4a44-a224-6fde3a142406-kube-api-access-4qkjc\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.506789 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" event={"ID":"f064abcd-d3d7-4a44-a224-6fde3a142406","Type":"ContainerDied","Data":"0b11d0c42d9c0e3492b5ab00ea0d3239900b1d2fcfdc2449849d4f58a1c84392"} Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.506869 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b11d0c42d9c0e3492b5ab00ea0d3239900b1d2fcfdc2449849d4f58a1c84392" Dec 03 08:14:23 crc kubenswrapper[4831]: I1203 08:14:23.506891 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2" Dec 03 08:14:27 crc kubenswrapper[4831]: I1203 08:14:27.596301 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:14:27 crc kubenswrapper[4831]: I1203 08:14:27.596929 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:14:27 crc kubenswrapper[4831]: I1203 08:14:27.596980 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:14:27 crc kubenswrapper[4831]: I1203 08:14:27.597934 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"98f056b5c45cb7c3ebbde6e6e8b7c794ed7234028aab60b3d5caa65ca4fbade0"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:14:27 crc kubenswrapper[4831]: I1203 08:14:27.598016 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://98f056b5c45cb7c3ebbde6e6e8b7c794ed7234028aab60b3d5caa65ca4fbade0" gracePeriod=600 Dec 03 08:14:27 crc kubenswrapper[4831]: E1203 08:14:27.793796 4831 cadvisor_stats_provider.go:516] 
"Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e04caf2_8e18_4af8_9779_c5711262077b.slice/crio-conmon-98f056b5c45cb7c3ebbde6e6e8b7c794ed7234028aab60b3d5caa65ca4fbade0.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e04caf2_8e18_4af8_9779_c5711262077b.slice/crio-98f056b5c45cb7c3ebbde6e6e8b7c794ed7234028aab60b3d5caa65ca4fbade0.scope\": RecentStats: unable to find data in memory cache]" Dec 03 08:14:28 crc kubenswrapper[4831]: I1203 08:14:28.587677 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="98f056b5c45cb7c3ebbde6e6e8b7c794ed7234028aab60b3d5caa65ca4fbade0" exitCode=0 Dec 03 08:14:28 crc kubenswrapper[4831]: I1203 08:14:28.587819 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"98f056b5c45cb7c3ebbde6e6e8b7c794ed7234028aab60b3d5caa65ca4fbade0"} Dec 03 08:14:28 crc kubenswrapper[4831]: I1203 08:14:28.588264 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911"} Dec 03 08:14:28 crc kubenswrapper[4831]: I1203 08:14:28.588286 4831 scope.go:117] "RemoveContainer" containerID="c74ab78793010050fe7299fa386cfc658060d07868deca63f3eb04cd26bdb342" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.443162 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6"] Dec 03 08:14:34 crc kubenswrapper[4831]: E1203 08:14:34.446527 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerName="util" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.446555 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerName="util" Dec 03 08:14:34 crc kubenswrapper[4831]: E1203 08:14:34.446610 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerName="extract" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.446620 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerName="extract" Dec 03 08:14:34 crc kubenswrapper[4831]: E1203 08:14:34.446648 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon-log" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.446657 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon-log" Dec 03 08:14:34 crc kubenswrapper[4831]: E1203 08:14:34.446679 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerName="pull" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.446687 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerName="pull" Dec 03 08:14:34 crc kubenswrapper[4831]: E1203 08:14:34.446745 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.446754 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.447203 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon" Dec 03 08:14:34 crc 
kubenswrapper[4831]: I1203 08:14:34.447242 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f064abcd-d3d7-4a44-a224-6fde3a142406" containerName="extract" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.447260 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="acdd083e-b37d-4cdf-b856-3eabe411df76" containerName="horizon-log" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.456609 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.466896 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.467140 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-bd9j8" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.467278 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.499408 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6"] Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.561145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rbhx\" (UniqueName: \"kubernetes.io/projected/ae87822e-4b31-43cb-af6e-33739656a430-kube-api-access-4rbhx\") pod \"obo-prometheus-operator-668cf9dfbb-89ql6\" (UID: \"ae87822e-4b31-43cb-af6e-33739656a430\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.574349 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp"] Dec 03 08:14:34 crc 
kubenswrapper[4831]: I1203 08:14:34.575720 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.583656 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.583834 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-7jqcj" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.587543 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx"] Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.588954 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.603063 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp"] Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.621546 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx"] Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.662752 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rbhx\" (UniqueName: \"kubernetes.io/projected/ae87822e-4b31-43cb-af6e-33739656a430-kube-api-access-4rbhx\") pod \"obo-prometheus-operator-668cf9dfbb-89ql6\" (UID: \"ae87822e-4b31-43cb-af6e-33739656a430\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.662859 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d153656-c76f-46c6-a2c1-51e507cd8705-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp\" (UID: \"7d153656-c76f-46c6-a2c1-51e507cd8705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.662957 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d153656-c76f-46c6-a2c1-51e507cd8705-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp\" (UID: \"7d153656-c76f-46c6-a2c1-51e507cd8705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.684517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rbhx\" (UniqueName: \"kubernetes.io/projected/ae87822e-4b31-43cb-af6e-33739656a430-kube-api-access-4rbhx\") pod \"obo-prometheus-operator-668cf9dfbb-89ql6\" (UID: \"ae87822e-4b31-43cb-af6e-33739656a430\") " pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.764791 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d153656-c76f-46c6-a2c1-51e507cd8705-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp\" (UID: \"7d153656-c76f-46c6-a2c1-51e507cd8705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.764927 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/ad3a46cd-b6a3-4229-8d68-9d9931dc33bf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx\" (UID: \"ad3a46cd-b6a3-4229-8d68-9d9931dc33bf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.765045 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ad3a46cd-b6a3-4229-8d68-9d9931dc33bf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx\" (UID: \"ad3a46cd-b6a3-4229-8d68-9d9931dc33bf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.765129 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/7d153656-c76f-46c6-a2c1-51e507cd8705-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp\" (UID: \"7d153656-c76f-46c6-a2c1-51e507cd8705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.771974 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/7d153656-c76f-46c6-a2c1-51e507cd8705-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp\" (UID: \"7d153656-c76f-46c6-a2c1-51e507cd8705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.773031 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-tchlg"] Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.773880 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/7d153656-c76f-46c6-a2c1-51e507cd8705-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp\" (UID: \"7d153656-c76f-46c6-a2c1-51e507cd8705\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.774342 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.786086 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.786308 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-h6nnt" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.797945 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.810389 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-tchlg"] Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.866723 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5828415-c585-469e-8596-3c7142eb299a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-tchlg\" (UID: \"c5828415-c585-469e-8596-3c7142eb299a\") " pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.866770 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8tj9\" (UniqueName: 
\"kubernetes.io/projected/c5828415-c585-469e-8596-3c7142eb299a-kube-api-access-k8tj9\") pod \"observability-operator-d8bb48f5d-tchlg\" (UID: \"c5828415-c585-469e-8596-3c7142eb299a\") " pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.866860 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ad3a46cd-b6a3-4229-8d68-9d9931dc33bf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx\" (UID: \"ad3a46cd-b6a3-4229-8d68-9d9931dc33bf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.866928 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ad3a46cd-b6a3-4229-8d68-9d9931dc33bf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx\" (UID: \"ad3a46cd-b6a3-4229-8d68-9d9931dc33bf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.873764 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ad3a46cd-b6a3-4229-8d68-9d9931dc33bf-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx\" (UID: \"ad3a46cd-b6a3-4229-8d68-9d9931dc33bf\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.873879 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ad3a46cd-b6a3-4229-8d68-9d9931dc33bf-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx\" (UID: \"ad3a46cd-b6a3-4229-8d68-9d9931dc33bf\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.922856 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.935831 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.970419 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5828415-c585-469e-8596-3c7142eb299a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-tchlg\" (UID: \"c5828415-c585-469e-8596-3c7142eb299a\") " pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.970471 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k8tj9\" (UniqueName: \"kubernetes.io/projected/c5828415-c585-469e-8596-3c7142eb299a-kube-api-access-k8tj9\") pod \"observability-operator-d8bb48f5d-tchlg\" (UID: \"c5828415-c585-469e-8596-3c7142eb299a\") " pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.979004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5828415-c585-469e-8596-3c7142eb299a-observability-operator-tls\") pod \"observability-operator-d8bb48f5d-tchlg\" (UID: \"c5828415-c585-469e-8596-3c7142eb299a\") " pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.983377 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/perses-operator-5446b9c989-c7d8p"] Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.985005 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:34 crc kubenswrapper[4831]: I1203 08:14:34.995111 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-jzdmx" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.002572 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8tj9\" (UniqueName: \"kubernetes.io/projected/c5828415-c585-469e-8596-3c7142eb299a-kube-api-access-k8tj9\") pod \"observability-operator-d8bb48f5d-tchlg\" (UID: \"c5828415-c585-469e-8596-3c7142eb299a\") " pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.061655 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-c7d8p"] Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.072217 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6tgx\" (UniqueName: \"kubernetes.io/projected/930ff1fd-f482-47a3-a52a-09970ac40b24-kube-api-access-f6tgx\") pod \"perses-operator-5446b9c989-c7d8p\" (UID: \"930ff1fd-f482-47a3-a52a-09970ac40b24\") " pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.072476 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/930ff1fd-f482-47a3-a52a-09970ac40b24-openshift-service-ca\") pod \"perses-operator-5446b9c989-c7d8p\" (UID: \"930ff1fd-f482-47a3-a52a-09970ac40b24\") " pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.177054 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6tgx\" (UniqueName: \"kubernetes.io/projected/930ff1fd-f482-47a3-a52a-09970ac40b24-kube-api-access-f6tgx\") pod \"perses-operator-5446b9c989-c7d8p\" (UID: \"930ff1fd-f482-47a3-a52a-09970ac40b24\") " pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.179125 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/930ff1fd-f482-47a3-a52a-09970ac40b24-openshift-service-ca\") pod \"perses-operator-5446b9c989-c7d8p\" (UID: \"930ff1fd-f482-47a3-a52a-09970ac40b24\") " pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.180180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/930ff1fd-f482-47a3-a52a-09970ac40b24-openshift-service-ca\") pod \"perses-operator-5446b9c989-c7d8p\" (UID: \"930ff1fd-f482-47a3-a52a-09970ac40b24\") " pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.199969 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6tgx\" (UniqueName: \"kubernetes.io/projected/930ff1fd-f482-47a3-a52a-09970ac40b24-kube-api-access-f6tgx\") pod \"perses-operator-5446b9c989-c7d8p\" (UID: \"930ff1fd-f482-47a3-a52a-09970ac40b24\") " pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.231513 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.366155 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.431176 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6"] Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.629892 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx"] Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.753262 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp"] Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.774281 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6" event={"ID":"ae87822e-4b31-43cb-af6e-33739656a430","Type":"ContainerStarted","Data":"c9d5d21a498364717562f0b160d95d4315c1a77d865f3297d9776f4def97aad2"} Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.776137 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" event={"ID":"ad3a46cd-b6a3-4229-8d68-9d9931dc33bf","Type":"ContainerStarted","Data":"8f662d046b4f81afeb09d3cb17ac973835fcab2845c607b9334768b9d7f94017"} Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.853950 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-d8bb48f5d-tchlg"] Dec 03 08:14:35 crc kubenswrapper[4831]: I1203 08:14:35.976739 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5446b9c989-c7d8p"] Dec 03 08:14:36 crc kubenswrapper[4831]: I1203 08:14:36.796088 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" 
event={"ID":"c5828415-c585-469e-8596-3c7142eb299a","Type":"ContainerStarted","Data":"b77b12ad9fa535b329fa1edb5aeabe6f94ab11d7c6344acfd247b3112505051c"} Dec 03 08:14:36 crc kubenswrapper[4831]: I1203 08:14:36.808203 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" event={"ID":"7d153656-c76f-46c6-a2c1-51e507cd8705","Type":"ContainerStarted","Data":"b0f2ccb071692e008cc9d7f7e0f2d3d2d81d42327e41ff6c59b914588b635562"} Dec 03 08:14:36 crc kubenswrapper[4831]: I1203 08:14:36.831383 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-c7d8p" event={"ID":"930ff1fd-f482-47a3-a52a-09970ac40b24","Type":"ContainerStarted","Data":"69c7f99f5e127cfd36253abd143df8e3c34ea46a068788c29083cbd76930255b"} Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.026891 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.028541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6" event={"ID":"ae87822e-4b31-43cb-af6e-33739656a430","Type":"ContainerStarted","Data":"f94d883438fac89c3df7563af846f71fe19516aacdc732053239b626a80cb4d4"} Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.028682 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" event={"ID":"ad3a46cd-b6a3-4229-8d68-9d9931dc33bf","Type":"ContainerStarted","Data":"8dc26847a05dac4f5559c46b6f39712c4c494d8e4cd6614a5b7bc20dca0378cb"} Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.028799 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5446b9c989-c7d8p" 
event={"ID":"930ff1fd-f482-47a3-a52a-09970ac40b24","Type":"ContainerStarted","Data":"3fc49059192e5dd813fa92b383330b85ab585c7e11cb2a9b97072cc39e290db3"} Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.028894 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.029027 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.029118 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" event={"ID":"c5828415-c585-469e-8596-3c7142eb299a","Type":"ContainerStarted","Data":"f8cddf17823169cbbddb1366a0063907172b2a6570d697e4c250c83d8b846396"} Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.029198 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" event={"ID":"7d153656-c76f-46c6-a2c1-51e507cd8705","Type":"ContainerStarted","Data":"69ceb7ed44a8505fcdcdd434e777c792b8eeac2d1b039d5d2ef05ce1484d8e06"} Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.073041 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-668cf9dfbb-89ql6" podStartSLOduration=2.315523851 podStartE2EDuration="13.073025024s" podCreationTimestamp="2025-12-03 08:14:34 +0000 UTC" firstStartedPulling="2025-12-03 08:14:35.454072389 +0000 UTC m=+6212.797655897" lastFinishedPulling="2025-12-03 08:14:46.211573572 +0000 UTC m=+6223.555157070" observedRunningTime="2025-12-03 08:14:47.070733783 +0000 UTC m=+6224.414317301" watchObservedRunningTime="2025-12-03 08:14:47.073025024 +0000 UTC m=+6224.416608532" Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.111254 4831 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-operators/observability-operator-d8bb48f5d-tchlg" podStartSLOduration=2.6719490219999997 podStartE2EDuration="13.111231005s" podCreationTimestamp="2025-12-03 08:14:34 +0000 UTC" firstStartedPulling="2025-12-03 08:14:35.854464299 +0000 UTC m=+6213.198047807" lastFinishedPulling="2025-12-03 08:14:46.293746282 +0000 UTC m=+6223.637329790" observedRunningTime="2025-12-03 08:14:47.100396206 +0000 UTC m=+6224.443979734" watchObservedRunningTime="2025-12-03 08:14:47.111231005 +0000 UTC m=+6224.454814513" Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.119683 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp" podStartSLOduration=2.67023608 podStartE2EDuration="13.119666017s" podCreationTimestamp="2025-12-03 08:14:34 +0000 UTC" firstStartedPulling="2025-12-03 08:14:35.758003566 +0000 UTC m=+6213.101587074" lastFinishedPulling="2025-12-03 08:14:46.207433503 +0000 UTC m=+6223.551017011" observedRunningTime="2025-12-03 08:14:47.116706375 +0000 UTC m=+6224.460289893" watchObservedRunningTime="2025-12-03 08:14:47.119666017 +0000 UTC m=+6224.463249525" Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.162302 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx" podStartSLOduration=2.58616942 podStartE2EDuration="13.162279554s" podCreationTimestamp="2025-12-03 08:14:34 +0000 UTC" firstStartedPulling="2025-12-03 08:14:35.636010766 +0000 UTC m=+6212.979594274" lastFinishedPulling="2025-12-03 08:14:46.2121209 +0000 UTC m=+6223.555704408" observedRunningTime="2025-12-03 08:14:47.159155596 +0000 UTC m=+6224.502739104" watchObservedRunningTime="2025-12-03 08:14:47.162279554 +0000 UTC m=+6224.505863062" Dec 03 08:14:47 crc kubenswrapper[4831]: I1203 08:14:47.195280 4831 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-operators/perses-operator-5446b9c989-c7d8p" podStartSLOduration=2.974256688 podStartE2EDuration="13.195262452s" podCreationTimestamp="2025-12-03 08:14:34 +0000 UTC" firstStartedPulling="2025-12-03 08:14:35.987748471 +0000 UTC m=+6213.331331979" lastFinishedPulling="2025-12-03 08:14:46.208754245 +0000 UTC m=+6223.552337743" observedRunningTime="2025-12-03 08:14:47.193692453 +0000 UTC m=+6224.537275951" watchObservedRunningTime="2025-12-03 08:14:47.195262452 +0000 UTC m=+6224.538845960" Dec 03 08:14:55 crc kubenswrapper[4831]: I1203 08:14:55.379082 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5446b9c989-c7d8p" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.005756 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.006344 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/openstackclient" podUID="fdb7ee35-d757-414f-b20f-227ce78917e7" containerName="openstackclient" containerID="cri-o://4355591e4499fd416dff34cd2bd7e62fac68847762422a98953e866be23cf375" gracePeriod=2 Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.016485 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.061532 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 08:14:58 crc kubenswrapper[4831]: E1203 08:14:58.062385 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fdb7ee35-d757-414f-b20f-227ce78917e7" containerName="openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.062403 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fdb7ee35-d757-414f-b20f-227ce78917e7" containerName="openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.062906 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="fdb7ee35-d757-414f-b20f-227ce78917e7" containerName="openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.063649 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.069303 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="fdb7ee35-d757-414f-b20f-227ce78917e7" podUID="aa48e028-4a0e-4242-aa87-d15a5f6c0c76" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.094551 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.106865 4831 status_manager.go:875] "Failed to update status for pod" pod="openstack/openstackclient" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T08:14:58Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T08:14:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T08:14:58Z\\\",\\\"message\\\":\\\"containers with unready status: [openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2025-12-03T08:14:58Z\\\",\\\"message\\\":\\\"containers with unready status: 
[openstackclient]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/podified-antelope-centos9/openstack-openstackclient:current-podified\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"openstackclient\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/clouds.yaml\\\",\\\"name\\\":\\\"openstack-config\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/.config/openstack/secure.yaml\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/home/cloud-admin/cloudrc\\\",\\\"name\\\":\\\"openstack-config-secret\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2f8w\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2025-12-03T08:14:58Z\\\"}}\" for pod \"openstack\"/\"openstackclient\": pods \"openstackclient\" not found" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.121368 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/openstackclient"] Dec 03 08:14:58 crc kubenswrapper[4831]: E1203 08:14:58.122134 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-q2f8w openstack-config openstack-config-secret], unattached volumes=[], failed to process volumes=[kube-api-access-q2f8w openstack-config openstack-config-secret]: context canceled" pod="openstack/openstackclient" podUID="aa48e028-4a0e-4242-aa87-d15a5f6c0c76" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.153095 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/openstackclient"] Dec 03 08:14:58 crc 
kubenswrapper[4831]: I1203 08:14:58.168034 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.169459 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.231930 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.234744 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config\") pod \"openstackclient\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.234847 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2f8w\" (UniqueName: \"kubernetes.io/projected/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-kube-api-access-q2f8w\") pod \"openstackclient\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.242024 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aa48e028-4a0e-4242-aa87-d15a5f6c0c76" podUID="85e48574-0a26-4ec9-ac44-f59acf845387" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.256434 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config-secret\") pod \"openstackclient\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.346746 4831 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.354846 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aa48e028-4a0e-4242-aa87-d15a5f6c0c76" podUID="85e48574-0a26-4ec9-ac44-f59acf845387" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.359490 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6gqz\" (UniqueName: \"kubernetes.io/projected/85e48574-0a26-4ec9-ac44-f59acf845387-kube-api-access-v6gqz\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.359617 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config\") pod \"openstackclient\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.359662 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2f8w\" (UniqueName: \"kubernetes.io/projected/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-kube-api-access-q2f8w\") pod \"openstackclient\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.359683 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config-secret\") pod \"openstackclient\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.359744 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/85e48574-0a26-4ec9-ac44-f59acf845387-openstack-config-secret\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.359784 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/85e48574-0a26-4ec9-ac44-f59acf845387-openstack-config\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.361195 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config\") pod \"openstackclient\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.364816 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.366639 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 08:14:58 crc kubenswrapper[4831]: E1203 08:14:58.371749 4831 projected.go:194] Error preparing data for projected volume kube-api-access-q2f8w for pod openstack/openstackclient: failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (aa48e028-4a0e-4242-aa87-d15a5f6c0c76) does not match the UID in record. 
The object might have been deleted and then recreated Dec 03 08:14:58 crc kubenswrapper[4831]: E1203 08:14:58.371837 4831 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-kube-api-access-q2f8w podName:aa48e028-4a0e-4242-aa87-d15a5f6c0c76 nodeName:}" failed. No retries permitted until 2025-12-03 08:14:58.871813326 +0000 UTC m=+6236.215396834 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-q2f8w" (UniqueName: "kubernetes.io/projected/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-kube-api-access-q2f8w") pod "openstackclient" (UID: "aa48e028-4a0e-4242-aa87-d15a5f6c0c76") : failed to fetch token: serviceaccounts "openstackclient-openstackclient" is forbidden: the UID in the bound object reference (aa48e028-4a0e-4242-aa87-d15a5f6c0c76) does not match the UID in record. The object might have been deleted and then recreated Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.372243 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-dcnjv" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.398231 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config-secret\") pod \"openstackclient\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.407808 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.462945 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/85e48574-0a26-4ec9-ac44-f59acf845387-openstack-config-secret\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.463039 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/85e48574-0a26-4ec9-ac44-f59acf845387-openstack-config\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.463128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nffz\" (UniqueName: \"kubernetes.io/projected/670949d7-821a-4f20-84df-e32be87cac88-kube-api-access-2nffz\") pod \"kube-state-metrics-0\" (UID: \"670949d7-821a-4f20-84df-e32be87cac88\") " pod="openstack/kube-state-metrics-0" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.463167 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6gqz\" (UniqueName: \"kubernetes.io/projected/85e48574-0a26-4ec9-ac44-f59acf845387-kube-api-access-v6gqz\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.465517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/85e48574-0a26-4ec9-ac44-f59acf845387-openstack-config\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 
08:14:58.470335 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.483820 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/85e48574-0a26-4ec9-ac44-f59acf845387-openstack-config-secret\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.489112 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aa48e028-4a0e-4242-aa87-d15a5f6c0c76" podUID="85e48574-0a26-4ec9-ac44-f59acf845387" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.512701 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6gqz\" (UniqueName: \"kubernetes.io/projected/85e48574-0a26-4ec9-ac44-f59acf845387-kube-api-access-v6gqz\") pod \"openstackclient\" (UID: \"85e48574-0a26-4ec9-ac44-f59acf845387\") " pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.534102 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.565487 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config-secret\") pod \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.565606 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config\") pod \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\" (UID: \"aa48e028-4a0e-4242-aa87-d15a5f6c0c76\") " Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.565966 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nffz\" (UniqueName: \"kubernetes.io/projected/670949d7-821a-4f20-84df-e32be87cac88-kube-api-access-2nffz\") pod \"kube-state-metrics-0\" (UID: \"670949d7-821a-4f20-84df-e32be87cac88\") " pod="openstack/kube-state-metrics-0" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.566089 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2f8w\" (UniqueName: \"kubernetes.io/projected/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-kube-api-access-q2f8w\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.567042 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "aa48e028-4a0e-4242-aa87-d15a5f6c0c76" (UID: "aa48e028-4a0e-4242-aa87-d15a5f6c0c76"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.604972 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "aa48e028-4a0e-4242-aa87-d15a5f6c0c76" (UID: "aa48e028-4a0e-4242-aa87-d15a5f6c0c76"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.609010 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nffz\" (UniqueName: \"kubernetes.io/projected/670949d7-821a-4f20-84df-e32be87cac88-kube-api-access-2nffz\") pod \"kube-state-metrics-0\" (UID: \"670949d7-821a-4f20-84df-e32be87cac88\") " pod="openstack/kube-state-metrics-0" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.671289 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.671327 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/aa48e028-4a0e-4242-aa87-d15a5f6c0c76-openstack-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:14:58 crc kubenswrapper[4831]: I1203 08:14:58.782826 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.044801 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa48e028-4a0e-4242-aa87-d15a5f6c0c76" path="/var/lib/kubelet/pods/aa48e028-4a0e-4242-aa87-d15a5f6c0c76/volumes" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.220808 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.223260 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.238628 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-web-config" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.238819 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-cluster-tls-config" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.238936 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-alertmanager-dockercfg-5zvgd" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.239043 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-generated" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.239151 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"alertmanager-metric-storage-tls-assets-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.295088 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 
08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.295372 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.295408 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.295454 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.295472 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.295505 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmb76\" (UniqueName: \"kubernetes.io/projected/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-kube-api-access-vmb76\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " 
pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.295527 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.304600 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"] Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.367735 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.408443 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.408497 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.408549 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 
08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.408569 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.408601 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmb76\" (UniqueName: \"kubernetes.io/projected/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-kube-api-access-vmb76\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.408622 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.408680 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0" Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.415985 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-metric-storage-db\" (UniqueName: \"kubernetes.io/empty-dir/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-alertmanager-metric-storage-db\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " 
pod="openstack/alertmanager-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.428971 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-cluster-tls-config\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.429120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-tls-assets\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.434098 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-config-out\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.434868 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-web-config\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.437997 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-config-volume\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.443948 4831 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openstack/openstackclient" oldPodUID="aa48e028-4a0e-4242-aa87-d15a5f6c0c76" podUID="85e48574-0a26-4ec9-ac44-f59acf845387"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.511886 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.530294 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmb76\" (UniqueName: \"kubernetes.io/projected/aefc8d48-0a47-424c-bce7-8e5c75a6d0fe-kube-api-access-vmb76\") pod \"alertmanager-metric-storage-0\" (UID: \"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe\") " pod="openstack/alertmanager-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.594524 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/alertmanager-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.740698 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.754004 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.757880 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.767728 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-web-config"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.767942 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-thanos-prometheus-http-client-file"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.768050 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"prometheus-metric-storage-tls-assets-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.768196 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"prometheus-metric-storage-rulefiles-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.768293 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"metric-storage-prometheus-dockercfg-ltdzt"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.781499 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.824753 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b7346d4a-a166-4419-8305-76ecd3ccf9b1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.824798 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.824867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x575g\" (UniqueName: \"kubernetes.io/projected/b7346d4a-a166-4419-8305-76ecd3ccf9b1-kube-api-access-x575g\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.824907 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b7346d4a-a166-4419-8305-76ecd3ccf9b1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.824968 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-config\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.825008 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-3d5be284-fc11-4087-be60-6d07e588cd81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d5be284-fc11-4087-be60-6d07e588cd81\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.825123 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b7346d4a-a166-4419-8305-76ecd3ccf9b1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.825160 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.923141 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"]
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.928699 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b7346d4a-a166-4419-8305-76ecd3ccf9b1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.928750 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.928807 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b7346d4a-a166-4419-8305-76ecd3ccf9b1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.928832 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.928888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x575g\" (UniqueName: \"kubernetes.io/projected/b7346d4a-a166-4419-8305-76ecd3ccf9b1-kube-api-access-x575g\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.928917 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b7346d4a-a166-4419-8305-76ecd3ccf9b1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.928970 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-config\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.929002 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-3d5be284-fc11-4087-be60-6d07e588cd81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d5be284-fc11-4087-be60-6d07e588cd81\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.929855 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-metric-storage-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/b7346d4a-a166-4419-8305-76ecd3ccf9b1-prometheus-metric-storage-rulefiles-0\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.938582 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-config\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.943467 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b7346d4a-a166-4419-8305-76ecd3ccf9b1-config-out\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.946967 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-thanos-prometheus-http-client-file\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.947168 4831 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.947195 4831 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-3d5be284-fc11-4087-be60-6d07e588cd81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d5be284-fc11-4087-be60-6d07e588cd81\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/b7f07084153b92d402756f0fc02d6dbc4225790ec42062a4e2f7d44708d4e8d6/globalmount\"" pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.948739 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b7346d4a-a166-4419-8305-76ecd3ccf9b1-web-config\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.949569 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b7346d4a-a166-4419-8305-76ecd3ccf9b1-tls-assets\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:14:59 crc kubenswrapper[4831]: I1203 08:14:59.959389 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x575g\" (UniqueName: \"kubernetes.io/projected/b7346d4a-a166-4419-8305-76ecd3ccf9b1-kube-api-access-x575g\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.177623 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"]
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.179493 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.181818 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.182112 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.245542 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-3d5be284-fc11-4087-be60-6d07e588cd81\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-3d5be284-fc11-4087-be60-6d07e588cd81\") pod \"prometheus-metric-storage-0\" (UID: \"b7346d4a-a166-4419-8305-76ecd3ccf9b1\") " pod="openstack/prometheus-metric-storage-0"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.264362 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"]
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.358495 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9cbbf48-7afa-4974-abdf-29610bca3012-secret-volume\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.358562 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxn7w\" (UniqueName: \"kubernetes.io/projected/f9cbbf48-7afa-4974-abdf-29610bca3012-kube-api-access-pxn7w\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.358949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9cbbf48-7afa-4974-abdf-29610bca3012-config-volume\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.389661 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"670949d7-821a-4f20-84df-e32be87cac88","Type":"ContainerStarted","Data":"f37e1b22748dc9a7ad98adf8c0abfd3952f11c5c06a96c5714c559715cbf2f29"}
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.391084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"85e48574-0a26-4ec9-ac44-f59acf845387","Type":"ContainerStarted","Data":"9d6e2f243aeb7d03ae16a41df772789c4e1342beed2810411383ba4f1b1f26bd"}
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.391132 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"85e48574-0a26-4ec9-ac44-f59acf845387","Type":"ContainerStarted","Data":"284198d044ff02fd909b82a0066afd6e9abf434559da26ad69f7645008ee8c81"}
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.391995 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/prometheus-metric-storage-0"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.396909 4831 generic.go:334] "Generic (PLEG): container finished" podID="fdb7ee35-d757-414f-b20f-227ce78917e7" containerID="4355591e4499fd416dff34cd2bd7e62fac68847762422a98953e866be23cf375" exitCode=137
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.423089 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=2.423068047 podStartE2EDuration="2.423068047s" podCreationTimestamp="2025-12-03 08:14:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:15:00.419730093 +0000 UTC m=+6237.763313611" watchObservedRunningTime="2025-12-03 08:15:00.423068047 +0000 UTC m=+6237.766651555"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.465879 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pxn7w\" (UniqueName: \"kubernetes.io/projected/f9cbbf48-7afa-4974-abdf-29610bca3012-kube-api-access-pxn7w\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.466053 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9cbbf48-7afa-4974-abdf-29610bca3012-config-volume\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.466160 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9cbbf48-7afa-4974-abdf-29610bca3012-secret-volume\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.468906 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9cbbf48-7afa-4974-abdf-29610bca3012-config-volume\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.478164 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9cbbf48-7afa-4974-abdf-29610bca3012-secret-volume\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.487263 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxn7w\" (UniqueName: \"kubernetes.io/projected/f9cbbf48-7afa-4974-abdf-29610bca3012-kube-api-access-pxn7w\") pod \"collect-profiles-29412495-dsnxs\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.554447 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.683920 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/alertmanager-metric-storage-0"]
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.764474 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.876310 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brb7p\" (UniqueName: \"kubernetes.io/projected/fdb7ee35-d757-414f-b20f-227ce78917e7-kube-api-access-brb7p\") pod \"fdb7ee35-d757-414f-b20f-227ce78917e7\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") "
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.876386 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config-secret\") pod \"fdb7ee35-d757-414f-b20f-227ce78917e7\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") "
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.876462 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config\") pod \"fdb7ee35-d757-414f-b20f-227ce78917e7\" (UID: \"fdb7ee35-d757-414f-b20f-227ce78917e7\") "
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.884611 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fdb7ee35-d757-414f-b20f-227ce78917e7-kube-api-access-brb7p" (OuterVolumeSpecName: "kube-api-access-brb7p") pod "fdb7ee35-d757-414f-b20f-227ce78917e7" (UID: "fdb7ee35-d757-414f-b20f-227ce78917e7"). InnerVolumeSpecName "kube-api-access-brb7p". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.907754 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "fdb7ee35-d757-414f-b20f-227ce78917e7" (UID: "fdb7ee35-d757-414f-b20f-227ce78917e7"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.942421 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "fdb7ee35-d757-414f-b20f-227ce78917e7" (UID: "fdb7ee35-d757-414f-b20f-227ce78917e7"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.978384 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brb7p\" (UniqueName: \"kubernetes.io/projected/fdb7ee35-d757-414f-b20f-227ce78917e7-kube-api-access-brb7p\") on node \"crc\" DevicePath \"\""
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.978417 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config-secret\") on node \"crc\" DevicePath \"\""
Dec 03 08:15:00 crc kubenswrapper[4831]: I1203 08:15:00.978428 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/fdb7ee35-d757-414f-b20f-227ce78917e7-openstack-config\") on node \"crc\" DevicePath \"\""
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.010996 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/prometheus-metric-storage-0"]
Dec 03 08:15:01 crc kubenswrapper[4831]: W1203 08:15:01.014733 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7346d4a_a166_4419_8305_76ecd3ccf9b1.slice/crio-2f2949e54b559a8c3ce84c4e4ffa4944ab6126df9d16fc479e096d435a2ba252 WatchSource:0}: Error finding container 2f2949e54b559a8c3ce84c4e4ffa4944ab6126df9d16fc479e096d435a2ba252: Status 404 returned error can't find the container with id 2f2949e54b559a8c3ce84c4e4ffa4944ab6126df9d16fc479e096d435a2ba252
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.026809 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fdb7ee35-d757-414f-b20f-227ce78917e7" path="/var/lib/kubelet/pods/fdb7ee35-d757-414f-b20f-227ce78917e7/volumes"
Dec 03 08:15:01 crc kubenswrapper[4831]: W1203 08:15:01.161984 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9cbbf48_7afa_4974_abdf_29610bca3012.slice/crio-73046d1a6449e08a2c89d740131fd866b952688b248c1eb30c121bffc2bb928b WatchSource:0}: Error finding container 73046d1a6449e08a2c89d740131fd866b952688b248c1eb30c121bffc2bb928b: Status 404 returned error can't find the container with id 73046d1a6449e08a2c89d740131fd866b952688b248c1eb30c121bffc2bb928b
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.165047 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"]
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.408971 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.408995 4831 scope.go:117] "RemoveContainer" containerID="4355591e4499fd416dff34cd2bd7e62fac68847762422a98953e866be23cf375"
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.411825 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe","Type":"ContainerStarted","Data":"55d00496bc6628402ab9f659c48de3a6b2a6356a3844cb4a1ddf3950deb36587"}
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.413503 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs" event={"ID":"f9cbbf48-7afa-4974-abdf-29610bca3012","Type":"ContainerStarted","Data":"f414287ed552dd798b0f0866d2be1f587127d848aa75325c91a72844d9535b21"}
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.413533 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs" event={"ID":"f9cbbf48-7afa-4974-abdf-29610bca3012","Type":"ContainerStarted","Data":"73046d1a6449e08a2c89d740131fd866b952688b248c1eb30c121bffc2bb928b"}
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.415100 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b7346d4a-a166-4419-8305-76ecd3ccf9b1","Type":"ContainerStarted","Data":"2f2949e54b559a8c3ce84c4e4ffa4944ab6126df9d16fc479e096d435a2ba252"}
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.417512 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"670949d7-821a-4f20-84df-e32be87cac88","Type":"ContainerStarted","Data":"d9ad0e1bbdcc3cfa62848e842507fa8a62d6f3816b2d6475a7babc9781a6e44c"}
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.417692 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.470530 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs" podStartSLOduration=1.470505042 podStartE2EDuration="1.470505042s" podCreationTimestamp="2025-12-03 08:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:15:01.438954009 +0000 UTC m=+6238.782537567" watchObservedRunningTime="2025-12-03 08:15:01.470505042 +0000 UTC m=+6238.814088550"
Dec 03 08:15:01 crc kubenswrapper[4831]: I1203 08:15:01.479732 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.873700303 podStartE2EDuration="3.479710648s" podCreationTimestamp="2025-12-03 08:14:58 +0000 UTC" firstStartedPulling="2025-12-03 08:14:59.938247466 +0000 UTC m=+6237.281830974" lastFinishedPulling="2025-12-03 08:15:00.544257811 +0000 UTC m=+6237.887841319" observedRunningTime="2025-12-03 08:15:01.461596244 +0000 UTC m=+6238.805179752" watchObservedRunningTime="2025-12-03 08:15:01.479710648 +0000 UTC m=+6238.823294156"
Dec 03 08:15:02 crc kubenswrapper[4831]: I1203 08:15:02.439513 4831 generic.go:334] "Generic (PLEG): container finished" podID="f9cbbf48-7afa-4974-abdf-29610bca3012" containerID="f414287ed552dd798b0f0866d2be1f587127d848aa75325c91a72844d9535b21" exitCode=0
Dec 03 08:15:02 crc kubenswrapper[4831]: I1203 08:15:02.439579 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs" event={"ID":"f9cbbf48-7afa-4974-abdf-29610bca3012","Type":"ContainerDied","Data":"f414287ed552dd798b0f0866d2be1f587127d848aa75325c91a72844d9535b21"}
Dec 03 08:15:03 crc kubenswrapper[4831]: I1203 08:15:03.851371 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:03 crc kubenswrapper[4831]: I1203 08:15:03.968226 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9cbbf48-7afa-4974-abdf-29610bca3012-secret-volume\") pod \"f9cbbf48-7afa-4974-abdf-29610bca3012\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") "
Dec 03 08:15:03 crc kubenswrapper[4831]: I1203 08:15:03.968957 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pxn7w\" (UniqueName: \"kubernetes.io/projected/f9cbbf48-7afa-4974-abdf-29610bca3012-kube-api-access-pxn7w\") pod \"f9cbbf48-7afa-4974-abdf-29610bca3012\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") "
Dec 03 08:15:03 crc kubenswrapper[4831]: I1203 08:15:03.969025 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9cbbf48-7afa-4974-abdf-29610bca3012-config-volume\") pod \"f9cbbf48-7afa-4974-abdf-29610bca3012\" (UID: \"f9cbbf48-7afa-4974-abdf-29610bca3012\") "
Dec 03 08:15:03 crc kubenswrapper[4831]: I1203 08:15:03.970535 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9cbbf48-7afa-4974-abdf-29610bca3012-config-volume" (OuterVolumeSpecName: "config-volume") pod "f9cbbf48-7afa-4974-abdf-29610bca3012" (UID: "f9cbbf48-7afa-4974-abdf-29610bca3012"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 08:15:03 crc kubenswrapper[4831]: I1203 08:15:03.973933 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cbbf48-7afa-4974-abdf-29610bca3012-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f9cbbf48-7afa-4974-abdf-29610bca3012" (UID: "f9cbbf48-7afa-4974-abdf-29610bca3012"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:15:03 crc kubenswrapper[4831]: I1203 08:15:03.974123 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9cbbf48-7afa-4974-abdf-29610bca3012-kube-api-access-pxn7w" (OuterVolumeSpecName: "kube-api-access-pxn7w") pod "f9cbbf48-7afa-4974-abdf-29610bca3012" (UID: "f9cbbf48-7afa-4974-abdf-29610bca3012"). InnerVolumeSpecName "kube-api-access-pxn7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:15:04 crc kubenswrapper[4831]: I1203 08:15:04.072408 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pxn7w\" (UniqueName: \"kubernetes.io/projected/f9cbbf48-7afa-4974-abdf-29610bca3012-kube-api-access-pxn7w\") on node \"crc\" DevicePath \"\""
Dec 03 08:15:04 crc kubenswrapper[4831]: I1203 08:15:04.072548 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f9cbbf48-7afa-4974-abdf-29610bca3012-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 08:15:04 crc kubenswrapper[4831]: I1203 08:15:04.072664 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f9cbbf48-7afa-4974-abdf-29610bca3012-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 08:15:04 crc kubenswrapper[4831]: I1203 08:15:04.500577 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz"]
Dec 03 08:15:04 crc kubenswrapper[4831]: I1203 08:15:04.509933 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs" event={"ID":"f9cbbf48-7afa-4974-abdf-29610bca3012","Type":"ContainerDied","Data":"73046d1a6449e08a2c89d740131fd866b952688b248c1eb30c121bffc2bb928b"}
Dec 03 08:15:04 crc kubenswrapper[4831]: I1203 08:15:04.509978 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73046d1a6449e08a2c89d740131fd866b952688b248c1eb30c121bffc2bb928b"
Dec 03 08:15:04 crc kubenswrapper[4831]: I1203 08:15:04.510000 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"
Dec 03 08:15:04 crc kubenswrapper[4831]: I1203 08:15:04.510438 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412450-q7hsz"]
Dec 03 08:15:05 crc kubenswrapper[4831]: I1203 08:15:05.035312 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed8f3db3-96a2-4695-8c68-446fc5d299da" path="/var/lib/kubelet/pods/ed8f3db3-96a2-4695-8c68-446fc5d299da/volumes"
Dec 03 08:15:07 crc kubenswrapper[4831]: I1203 08:15:07.581214 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b7346d4a-a166-4419-8305-76ecd3ccf9b1","Type":"ContainerStarted","Data":"3f42c6e5fa3562536a34625e41931c42e1b0024a40ffe26910f0b81d874b3f2e"}
Dec 03 08:15:07 crc kubenswrapper[4831]: I1203 08:15:07.585118 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe","Type":"ContainerStarted","Data":"54887fac3fad437cc567a06ee394cdd1cede4648f5bf2dae3226cf6d7368b909"}
Dec 03 08:15:08 crc kubenswrapper[4831]: I1203 08:15:08.788622 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0"
Dec 03 08:15:11 crc kubenswrapper[4831]: I1203 08:15:11.059733 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-799b-account-create-update-bs5mz"]
Dec 03 08:15:11 crc kubenswrapper[4831]: I1203 08:15:11.068519 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-7w2vb"]
Dec 03 08:15:11 crc kubenswrapper[4831]: I1203 08:15:11.101139 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-mzhpz"]
Dec 03 08:15:11 crc kubenswrapper[4831]: I1203 08:15:11.110937 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-799b-account-create-update-bs5mz"]
Dec 03 08:15:11 crc kubenswrapper[4831]: I1203 08:15:11.120462 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-7w2vb"]
Dec 03 08:15:11 crc kubenswrapper[4831]: I1203 08:15:11.138115 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-mzhpz"]
Dec 03 08:15:12 crc kubenswrapper[4831]: I1203 08:15:12.048872 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-1d5b-account-create-update-8pz46"]
Dec 03 08:15:12 crc kubenswrapper[4831]: I1203 08:15:12.066198 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-602f-account-create-update-zq45g"]
Dec 03 08:15:12 crc kubenswrapper[4831]: I1203 08:15:12.079740 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-tqqqc"]
Dec 03 08:15:12 crc kubenswrapper[4831]: I1203 08:15:12.090067 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-tqqqc"]
Dec 03 08:15:12 crc kubenswrapper[4831]: I1203 08:15:12.100231 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-1d5b-account-create-update-8pz46"]
Dec 03 08:15:12 crc kubenswrapper[4831]: I1203 08:15:12.110784 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-602f-account-create-update-zq45g"]
Dec 03 08:15:13 crc kubenswrapper[4831]: I1203 08:15:13.032211 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c80b66c-c8e7-4252-9a57-dbd41a97b743" path="/var/lib/kubelet/pods/0c80b66c-c8e7-4252-9a57-dbd41a97b743/volumes"
Dec 03 08:15:13 crc kubenswrapper[4831]: I1203 08:15:13.033975 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir"
podUID="408a6d5e-0406-4be5-8618-55c294287e17" path="/var/lib/kubelet/pods/408a6d5e-0406-4be5-8618-55c294287e17/volumes" Dec 03 08:15:13 crc kubenswrapper[4831]: I1203 08:15:13.035502 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bc239b6-ab23-4d2e-a970-ca26af1e40b2" path="/var/lib/kubelet/pods/6bc239b6-ab23-4d2e-a970-ca26af1e40b2/volumes" Dec 03 08:15:13 crc kubenswrapper[4831]: I1203 08:15:13.036549 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a657f01-40c8-4311-9929-1c8f616fdbd2" path="/var/lib/kubelet/pods/7a657f01-40c8-4311-9929-1c8f616fdbd2/volumes" Dec 03 08:15:13 crc kubenswrapper[4831]: I1203 08:15:13.038876 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="826d082d-eb2b-448b-bcdb-5b74e20e492a" path="/var/lib/kubelet/pods/826d082d-eb2b-448b-bcdb-5b74e20e492a/volumes" Dec 03 08:15:13 crc kubenswrapper[4831]: I1203 08:15:13.039736 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="98f2f011-b635-4508-bc7a-9898bffb6ca4" path="/var/lib/kubelet/pods/98f2f011-b635-4508-bc7a-9898bffb6ca4/volumes" Dec 03 08:15:15 crc kubenswrapper[4831]: I1203 08:15:15.687368 4831 generic.go:334] "Generic (PLEG): container finished" podID="aefc8d48-0a47-424c-bce7-8e5c75a6d0fe" containerID="54887fac3fad437cc567a06ee394cdd1cede4648f5bf2dae3226cf6d7368b909" exitCode=0 Dec 03 08:15:15 crc kubenswrapper[4831]: I1203 08:15:15.687433 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe","Type":"ContainerDied","Data":"54887fac3fad437cc567a06ee394cdd1cede4648f5bf2dae3226cf6d7368b909"} Dec 03 08:15:16 crc kubenswrapper[4831]: I1203 08:15:16.701465 4831 generic.go:334] "Generic (PLEG): container finished" podID="b7346d4a-a166-4419-8305-76ecd3ccf9b1" containerID="3f42c6e5fa3562536a34625e41931c42e1b0024a40ffe26910f0b81d874b3f2e" exitCode=0 Dec 03 08:15:16 crc kubenswrapper[4831]: I1203 
08:15:16.701528 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b7346d4a-a166-4419-8305-76ecd3ccf9b1","Type":"ContainerDied","Data":"3f42c6e5fa3562536a34625e41931c42e1b0024a40ffe26910f0b81d874b3f2e"} Dec 03 08:15:19 crc kubenswrapper[4831]: I1203 08:15:19.734828 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe","Type":"ContainerStarted","Data":"44a4252781b15013a8d4efbc498a46470d32150337501b9f0db1aa28ad15b841"} Dec 03 08:15:21 crc kubenswrapper[4831]: I1203 08:15:21.621739 4831 scope.go:117] "RemoveContainer" containerID="83afc0e752a5df8dcb286c53d8a6dcd5ae6353db2e2fafb318b6362af7991333" Dec 03 08:15:22 crc kubenswrapper[4831]: I1203 08:15:22.032699 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh2bx"] Dec 03 08:15:22 crc kubenswrapper[4831]: I1203 08:15:22.042815 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-zh2bx"] Dec 03 08:15:22 crc kubenswrapper[4831]: I1203 08:15:22.768199 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/alertmanager-metric-storage-0" event={"ID":"aefc8d48-0a47-424c-bce7-8e5c75a6d0fe","Type":"ContainerStarted","Data":"52146a62370553c9ca26f27837d3a894639191802acc88c8d8570dd47da23f16"} Dec 03 08:15:22 crc kubenswrapper[4831]: I1203 08:15:22.808865 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/alertmanager-metric-storage-0" podStartSLOduration=6.065449063 podStartE2EDuration="23.808813044s" podCreationTimestamp="2025-12-03 08:14:59 +0000 UTC" firstStartedPulling="2025-12-03 08:15:00.745791328 +0000 UTC m=+6238.089374836" lastFinishedPulling="2025-12-03 08:15:18.489155309 +0000 UTC m=+6255.832738817" observedRunningTime="2025-12-03 08:15:22.80196872 +0000 UTC m=+6260.145552238" watchObservedRunningTime="2025-12-03 
08:15:22.808813044 +0000 UTC m=+6260.152396592" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.031611 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6f966f8d-94ec-4e11-9049-35b3b66e192b" path="/var/lib/kubelet/pods/6f966f8d-94ec-4e11-9049-35b3b66e192b/volumes" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.247784 4831 scope.go:117] "RemoveContainer" containerID="f59fdcd959907bf12a3096cfae5d1674e40b384b1ee88597ba2592f88e4267e2" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.380630 4831 scope.go:117] "RemoveContainer" containerID="3aed52b3ebebf32f3f8db0d19b0d4bfdfc0a3f87d91edf8e0d473bfa37e94e9a" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.415921 4831 scope.go:117] "RemoveContainer" containerID="fd71482d9f20efd9e15333458c0dafd8184ac4a2e3cd91a3595f4d7d91bad990" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.552066 4831 scope.go:117] "RemoveContainer" containerID="82e2feb07f726b72e240005680cfdd8168a870bd48d8cc354c83df66a6570e1b" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.583868 4831 scope.go:117] "RemoveContainer" containerID="f1b858e32684363fea0633275d5005c29951501d24ea444d785f268121f09709" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.618599 4831 scope.go:117] "RemoveContainer" containerID="ac915b22360dd455aaeec441b056b79e0b914d84f303fbb9ca2670a8acae304a" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.779866 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b7346d4a-a166-4419-8305-76ecd3ccf9b1","Type":"ContainerStarted","Data":"4a66d4e734351ba1bd22cb06893372b0ad21f788b1e4ac9637c8daeb2ef43e09"} Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.787250 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/alertmanager-metric-storage-0" Dec 03 08:15:23 crc kubenswrapper[4831]: I1203 08:15:23.791024 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/alertmanager-metric-storage-0" Dec 03 08:15:30 crc kubenswrapper[4831]: I1203 08:15:30.875451 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b7346d4a-a166-4419-8305-76ecd3ccf9b1","Type":"ContainerStarted","Data":"52db529eee6f74e9c39e83308a87c447dbc9e5579894558ca7e44c4e9df48abb"} Dec 03 08:15:34 crc kubenswrapper[4831]: I1203 08:15:34.926177 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/prometheus-metric-storage-0" event={"ID":"b7346d4a-a166-4419-8305-76ecd3ccf9b1","Type":"ContainerStarted","Data":"b8a0e3cca92b96118eb0685f5e4817dd40adb4c1a1e97235f25a82b076fad6e7"} Dec 03 08:15:34 crc kubenswrapper[4831]: I1203 08:15:34.998228 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/prometheus-metric-storage-0" podStartSLOduration=4.138914501 podStartE2EDuration="36.998209336s" podCreationTimestamp="2025-12-03 08:14:58 +0000 UTC" firstStartedPulling="2025-12-03 08:15:01.016491131 +0000 UTC m=+6238.360074639" lastFinishedPulling="2025-12-03 08:15:33.875785966 +0000 UTC m=+6271.219369474" observedRunningTime="2025-12-03 08:15:34.985147559 +0000 UTC m=+6272.328731067" watchObservedRunningTime="2025-12-03 08:15:34.998209336 +0000 UTC m=+6272.341792844" Dec 03 08:15:35 crc kubenswrapper[4831]: I1203 08:15:35.393039 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/prometheus-metric-storage-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.658370 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:15:38 crc kubenswrapper[4831]: E1203 08:15:38.658981 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cbbf48-7afa-4974-abdf-29610bca3012" containerName="collect-profiles" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.658993 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f9cbbf48-7afa-4974-abdf-29610bca3012" containerName="collect-profiles" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.659230 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9cbbf48-7afa-4974-abdf-29610bca3012" containerName="collect-profiles" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.661523 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.663280 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.664568 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.675762 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.771886 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-config-data\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.771933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.771952 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-log-httpd\") pod \"ceilometer-0\" (UID: 
\"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.771971 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.771987 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-scripts\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.772029 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7bkvk\" (UniqueName: \"kubernetes.io/projected/57e98544-148c-4212-810d-cf047b7efbe8-kube-api-access-7bkvk\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.772065 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-run-httpd\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.874490 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-config-data\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.874551 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.874580 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-log-httpd\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.874609 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.874633 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-scripts\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.874670 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7bkvk\" (UniqueName: \"kubernetes.io/projected/57e98544-148c-4212-810d-cf047b7efbe8-kube-api-access-7bkvk\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.874728 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-run-httpd\") pod \"ceilometer-0\" (UID: 
\"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.875429 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-log-httpd\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.875536 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-run-httpd\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.881078 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-config-data\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.883016 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.891131 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-scripts\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.897185 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7bkvk\" (UniqueName: 
\"kubernetes.io/projected/57e98544-148c-4212-810d-cf047b7efbe8-kube-api-access-7bkvk\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.897214 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " pod="openstack/ceilometer-0" Dec 03 08:15:38 crc kubenswrapper[4831]: I1203 08:15:38.985255 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:15:39 crc kubenswrapper[4831]: I1203 08:15:39.484435 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:15:39 crc kubenswrapper[4831]: I1203 08:15:39.986134 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerStarted","Data":"e56576bd356db12e056e8d6e01786147e5f85f945d7b198e7db0de7048b0156d"} Dec 03 08:15:40 crc kubenswrapper[4831]: I1203 08:15:40.051549 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hjj9z"] Dec 03 08:15:40 crc kubenswrapper[4831]: I1203 08:15:40.061135 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-hjj9z"] Dec 03 08:15:40 crc kubenswrapper[4831]: I1203 08:15:40.996448 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerStarted","Data":"2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417"} Dec 03 08:15:40 crc kubenswrapper[4831]: I1203 08:15:40.997056 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerStarted","Data":"f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3"} Dec 03 08:15:41 crc kubenswrapper[4831]: I1203 08:15:41.056822 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4689e9b9-ff30-42db-b689-949dd272945f" path="/var/lib/kubelet/pods/4689e9b9-ff30-42db-b689-949dd272945f/volumes" Dec 03 08:15:41 crc kubenswrapper[4831]: I1203 08:15:41.057556 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-gzxbp"] Dec 03 08:15:41 crc kubenswrapper[4831]: I1203 08:15:41.057596 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-gzxbp"] Dec 03 08:15:42 crc kubenswrapper[4831]: I1203 08:15:42.011231 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerStarted","Data":"9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33"} Dec 03 08:15:43 crc kubenswrapper[4831]: I1203 08:15:43.047119 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddf02d76-b548-4fab-9e1d-690a64c0be2e" path="/var/lib/kubelet/pods/ddf02d76-b548-4fab-9e1d-690a64c0be2e/volumes" Dec 03 08:15:43 crc kubenswrapper[4831]: I1203 08:15:43.048211 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerStarted","Data":"e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9"} Dec 03 08:15:43 crc kubenswrapper[4831]: I1203 08:15:43.048258 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 08:15:43 crc kubenswrapper[4831]: I1203 08:15:43.093006 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.8694899120000001 podStartE2EDuration="5.092971044s" 
podCreationTimestamp="2025-12-03 08:15:38 +0000 UTC" firstStartedPulling="2025-12-03 08:15:39.490116706 +0000 UTC m=+6276.833700214" lastFinishedPulling="2025-12-03 08:15:42.713597808 +0000 UTC m=+6280.057181346" observedRunningTime="2025-12-03 08:15:43.087681989 +0000 UTC m=+6280.431265517" watchObservedRunningTime="2025-12-03 08:15:43.092971044 +0000 UTC m=+6280.436554562" Dec 03 08:15:45 crc kubenswrapper[4831]: I1203 08:15:45.393112 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/prometheus-metric-storage-0" Dec 03 08:15:45 crc kubenswrapper[4831]: I1203 08:15:45.396262 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/prometheus-metric-storage-0" Dec 03 08:15:46 crc kubenswrapper[4831]: I1203 08:15:46.070274 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/prometheus-metric-storage-0" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.285833 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-create-f7s6p"] Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.288119 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.309965 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f7s6p"] Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.381935 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-c002-account-create-update-ks2p9"] Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.383632 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.386056 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-db-secret" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.390405 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-c002-account-create-update-ks2p9"] Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.463248 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8020737-2ead-4709-ad77-8433e4fb38cb-operator-scripts\") pod \"aodh-db-create-f7s6p\" (UID: \"e8020737-2ead-4709-ad77-8433e4fb38cb\") " pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.463483 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9n2q\" (UniqueName: \"kubernetes.io/projected/e8020737-2ead-4709-ad77-8433e4fb38cb-kube-api-access-v9n2q\") pod \"aodh-db-create-f7s6p\" (UID: \"e8020737-2ead-4709-ad77-8433e4fb38cb\") " pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.566138 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-operator-scripts\") pod \"aodh-c002-account-create-update-ks2p9\" (UID: \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\") " pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.566593 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfcf\" (UniqueName: \"kubernetes.io/projected/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-kube-api-access-ddfcf\") pod \"aodh-c002-account-create-update-ks2p9\" (UID: 
\"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\") " pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.566801 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8020737-2ead-4709-ad77-8433e4fb38cb-operator-scripts\") pod \"aodh-db-create-f7s6p\" (UID: \"e8020737-2ead-4709-ad77-8433e4fb38cb\") " pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.566971 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9n2q\" (UniqueName: \"kubernetes.io/projected/e8020737-2ead-4709-ad77-8433e4fb38cb-kube-api-access-v9n2q\") pod \"aodh-db-create-f7s6p\" (UID: \"e8020737-2ead-4709-ad77-8433e4fb38cb\") " pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.567993 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8020737-2ead-4709-ad77-8433e4fb38cb-operator-scripts\") pod \"aodh-db-create-f7s6p\" (UID: \"e8020737-2ead-4709-ad77-8433e4fb38cb\") " pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.599005 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9n2q\" (UniqueName: \"kubernetes.io/projected/e8020737-2ead-4709-ad77-8433e4fb38cb-kube-api-access-v9n2q\") pod \"aodh-db-create-f7s6p\" (UID: \"e8020737-2ead-4709-ad77-8433e4fb38cb\") " pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.611343 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.668903 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-operator-scripts\") pod \"aodh-c002-account-create-update-ks2p9\" (UID: \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\") " pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.669061 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfcf\" (UniqueName: \"kubernetes.io/projected/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-kube-api-access-ddfcf\") pod \"aodh-c002-account-create-update-ks2p9\" (UID: \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\") " pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.671077 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-operator-scripts\") pod \"aodh-c002-account-create-update-ks2p9\" (UID: \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\") " pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.695273 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfcf\" (UniqueName: \"kubernetes.io/projected/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-kube-api-access-ddfcf\") pod \"aodh-c002-account-create-update-ks2p9\" (UID: \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\") " pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:50 crc kubenswrapper[4831]: I1203 08:15:50.701591 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:51 crc kubenswrapper[4831]: W1203 08:15:51.208902 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8020737_2ead_4709_ad77_8433e4fb38cb.slice/crio-4b7699e554ce1ece3edc0c41f1676fb8715573ead8076900cf9db381d6acc32c WatchSource:0}: Error finding container 4b7699e554ce1ece3edc0c41f1676fb8715573ead8076900cf9db381d6acc32c: Status 404 returned error can't find the container with id 4b7699e554ce1ece3edc0c41f1676fb8715573ead8076900cf9db381d6acc32c Dec 03 08:15:51 crc kubenswrapper[4831]: I1203 08:15:51.209068 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-create-f7s6p"] Dec 03 08:15:51 crc kubenswrapper[4831]: I1203 08:15:51.339796 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-c002-account-create-update-ks2p9"] Dec 03 08:15:51 crc kubenswrapper[4831]: W1203 08:15:51.343892 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaab1c516_0cc9_4ee3_b0ac_993e92c6fd10.slice/crio-4c8245661e0771ccc75cd81b0d307ec2028154464d97eee6be77d47b1671e93e WatchSource:0}: Error finding container 4c8245661e0771ccc75cd81b0d307ec2028154464d97eee6be77d47b1671e93e: Status 404 returned error can't find the container with id 4c8245661e0771ccc75cd81b0d307ec2028154464d97eee6be77d47b1671e93e Dec 03 08:15:52 crc kubenswrapper[4831]: I1203 08:15:52.130385 4831 generic.go:334] "Generic (PLEG): container finished" podID="e8020737-2ead-4709-ad77-8433e4fb38cb" containerID="489140d2e14c9afa177a1880cdf4b96133f8335120e51c370bdbcca6f3915763" exitCode=0 Dec 03 08:15:52 crc kubenswrapper[4831]: I1203 08:15:52.130449 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f7s6p" 
event={"ID":"e8020737-2ead-4709-ad77-8433e4fb38cb","Type":"ContainerDied","Data":"489140d2e14c9afa177a1880cdf4b96133f8335120e51c370bdbcca6f3915763"} Dec 03 08:15:52 crc kubenswrapper[4831]: I1203 08:15:52.130830 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f7s6p" event={"ID":"e8020737-2ead-4709-ad77-8433e4fb38cb","Type":"ContainerStarted","Data":"4b7699e554ce1ece3edc0c41f1676fb8715573ead8076900cf9db381d6acc32c"} Dec 03 08:15:52 crc kubenswrapper[4831]: I1203 08:15:52.133565 4831 generic.go:334] "Generic (PLEG): container finished" podID="aab1c516-0cc9-4ee3-b0ac-993e92c6fd10" containerID="ddf3a5fc7ce171a4af600381e10d4bc90fd1126b1a5bddc8b0ad55abe51a8845" exitCode=0 Dec 03 08:15:52 crc kubenswrapper[4831]: I1203 08:15:52.133611 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-c002-account-create-update-ks2p9" event={"ID":"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10","Type":"ContainerDied","Data":"ddf3a5fc7ce171a4af600381e10d4bc90fd1126b1a5bddc8b0ad55abe51a8845"} Dec 03 08:15:52 crc kubenswrapper[4831]: I1203 08:15:52.133637 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-c002-account-create-update-ks2p9" event={"ID":"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10","Type":"ContainerStarted","Data":"4c8245661e0771ccc75cd81b0d307ec2028154464d97eee6be77d47b1671e93e"} Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.704044 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.714093 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.849519 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-operator-scripts\") pod \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\" (UID: \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\") " Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.849654 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9n2q\" (UniqueName: \"kubernetes.io/projected/e8020737-2ead-4709-ad77-8433e4fb38cb-kube-api-access-v9n2q\") pod \"e8020737-2ead-4709-ad77-8433e4fb38cb\" (UID: \"e8020737-2ead-4709-ad77-8433e4fb38cb\") " Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.849919 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddfcf\" (UniqueName: \"kubernetes.io/projected/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-kube-api-access-ddfcf\") pod \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\" (UID: \"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10\") " Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.850007 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8020737-2ead-4709-ad77-8433e4fb38cb-operator-scripts\") pod \"e8020737-2ead-4709-ad77-8433e4fb38cb\" (UID: \"e8020737-2ead-4709-ad77-8433e4fb38cb\") " Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.850407 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "aab1c516-0cc9-4ee3-b0ac-993e92c6fd10" (UID: "aab1c516-0cc9-4ee3-b0ac-993e92c6fd10"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.851077 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8020737-2ead-4709-ad77-8433e4fb38cb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e8020737-2ead-4709-ad77-8433e4fb38cb" (UID: "e8020737-2ead-4709-ad77-8433e4fb38cb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.851179 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.859103 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-kube-api-access-ddfcf" (OuterVolumeSpecName: "kube-api-access-ddfcf") pod "aab1c516-0cc9-4ee3-b0ac-993e92c6fd10" (UID: "aab1c516-0cc9-4ee3-b0ac-993e92c6fd10"). InnerVolumeSpecName "kube-api-access-ddfcf". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.859256 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8020737-2ead-4709-ad77-8433e4fb38cb-kube-api-access-v9n2q" (OuterVolumeSpecName: "kube-api-access-v9n2q") pod "e8020737-2ead-4709-ad77-8433e4fb38cb" (UID: "e8020737-2ead-4709-ad77-8433e4fb38cb"). InnerVolumeSpecName "kube-api-access-v9n2q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.953723 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9n2q\" (UniqueName: \"kubernetes.io/projected/e8020737-2ead-4709-ad77-8433e4fb38cb-kube-api-access-v9n2q\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.953771 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddfcf\" (UniqueName: \"kubernetes.io/projected/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10-kube-api-access-ddfcf\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:53 crc kubenswrapper[4831]: I1203 08:15:53.953784 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e8020737-2ead-4709-ad77-8433e4fb38cb-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:15:54 crc kubenswrapper[4831]: I1203 08:15:54.046433 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-pclsl"] Dec 03 08:15:54 crc kubenswrapper[4831]: I1203 08:15:54.058634 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-pclsl"] Dec 03 08:15:54 crc kubenswrapper[4831]: I1203 08:15:54.168535 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-c002-account-create-update-ks2p9" Dec 03 08:15:54 crc kubenswrapper[4831]: I1203 08:15:54.168582 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-c002-account-create-update-ks2p9" event={"ID":"aab1c516-0cc9-4ee3-b0ac-993e92c6fd10","Type":"ContainerDied","Data":"4c8245661e0771ccc75cd81b0d307ec2028154464d97eee6be77d47b1671e93e"} Dec 03 08:15:54 crc kubenswrapper[4831]: I1203 08:15:54.168642 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4c8245661e0771ccc75cd81b0d307ec2028154464d97eee6be77d47b1671e93e" Dec 03 08:15:54 crc kubenswrapper[4831]: I1203 08:15:54.170612 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-create-f7s6p" event={"ID":"e8020737-2ead-4709-ad77-8433e4fb38cb","Type":"ContainerDied","Data":"4b7699e554ce1ece3edc0c41f1676fb8715573ead8076900cf9db381d6acc32c"} Dec 03 08:15:54 crc kubenswrapper[4831]: I1203 08:15:54.170657 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4b7699e554ce1ece3edc0c41f1676fb8715573ead8076900cf9db381d6acc32c" Dec 03 08:15:54 crc kubenswrapper[4831]: I1203 08:15:54.170739 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-create-f7s6p" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.030381 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5493d05-e214-4ce9-ab1d-39dc6d512041" path="/var/lib/kubelet/pods/d5493d05-e214-4ce9-ab1d-39dc6d512041/volumes" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.783512 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-db-sync-4c4nv"] Dec 03 08:15:55 crc kubenswrapper[4831]: E1203 08:15:55.784117 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8020737-2ead-4709-ad77-8433e4fb38cb" containerName="mariadb-database-create" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.784141 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8020737-2ead-4709-ad77-8433e4fb38cb" containerName="mariadb-database-create" Dec 03 08:15:55 crc kubenswrapper[4831]: E1203 08:15:55.784162 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aab1c516-0cc9-4ee3-b0ac-993e92c6fd10" containerName="mariadb-account-create-update" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.784169 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="aab1c516-0cc9-4ee3-b0ac-993e92c6fd10" containerName="mariadb-account-create-update" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.784432 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8020737-2ead-4709-ad77-8433e4fb38cb" containerName="mariadb-database-create" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.784461 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="aab1c516-0cc9-4ee3-b0ac-993e92c6fd10" containerName="mariadb-account-create-update" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.787700 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.791958 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.792540 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.792709 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-lpjtj" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.792835 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.795581 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-4c4nv"] Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.904307 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-scripts\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.904549 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-config-data\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.904649 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx6zr\" (UniqueName: \"kubernetes.io/projected/e766432e-74e3-4160-adf1-1d2406683662-kube-api-access-fx6zr\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " 
pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:55 crc kubenswrapper[4831]: I1203 08:15:55.904778 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-combined-ca-bundle\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.006807 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-scripts\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.006894 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-config-data\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.006973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx6zr\" (UniqueName: \"kubernetes.io/projected/e766432e-74e3-4160-adf1-1d2406683662-kube-api-access-fx6zr\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.007085 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-combined-ca-bundle\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.013109 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-config-data\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.013345 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-scripts\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.028195 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx6zr\" (UniqueName: \"kubernetes.io/projected/e766432e-74e3-4160-adf1-1d2406683662-kube-api-access-fx6zr\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.028720 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-combined-ca-bundle\") pod \"aodh-db-sync-4c4nv\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.122148 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:15:56 crc kubenswrapper[4831]: I1203 08:15:56.605003 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-db-sync-4c4nv"] Dec 03 08:15:57 crc kubenswrapper[4831]: I1203 08:15:57.204377 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4c4nv" event={"ID":"e766432e-74e3-4160-adf1-1d2406683662","Type":"ContainerStarted","Data":"50be7845cfbca123f24244b8589fc25cefb591747749229ece979b444850019b"} Dec 03 08:16:02 crc kubenswrapper[4831]: I1203 08:16:02.279363 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4c4nv" event={"ID":"e766432e-74e3-4160-adf1-1d2406683662","Type":"ContainerStarted","Data":"ede2f4831e82f16cdcd581ca83e3c1e0e006bfc3f238dbeb11528004f043a658"} Dec 03 08:16:02 crc kubenswrapper[4831]: I1203 08:16:02.311657 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-db-sync-4c4nv" podStartSLOduration=2.265912131 podStartE2EDuration="7.31163692s" podCreationTimestamp="2025-12-03 08:15:55 +0000 UTC" firstStartedPulling="2025-12-03 08:15:56.608904728 +0000 UTC m=+6293.952488236" lastFinishedPulling="2025-12-03 08:16:01.654629517 +0000 UTC m=+6298.998213025" observedRunningTime="2025-12-03 08:16:02.297020425 +0000 UTC m=+6299.640603963" watchObservedRunningTime="2025-12-03 08:16:02.31163692 +0000 UTC m=+6299.655220438" Dec 03 08:16:05 crc kubenswrapper[4831]: I1203 08:16:05.322439 4831 generic.go:334] "Generic (PLEG): container finished" podID="e766432e-74e3-4160-adf1-1d2406683662" containerID="ede2f4831e82f16cdcd581ca83e3c1e0e006bfc3f238dbeb11528004f043a658" exitCode=0 Dec 03 08:16:05 crc kubenswrapper[4831]: I1203 08:16:05.322549 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4c4nv" event={"ID":"e766432e-74e3-4160-adf1-1d2406683662","Type":"ContainerDied","Data":"ede2f4831e82f16cdcd581ca83e3c1e0e006bfc3f238dbeb11528004f043a658"} Dec 
03 08:16:06 crc kubenswrapper[4831]: I1203 08:16:06.861853 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.051203 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fx6zr\" (UniqueName: \"kubernetes.io/projected/e766432e-74e3-4160-adf1-1d2406683662-kube-api-access-fx6zr\") pod \"e766432e-74e3-4160-adf1-1d2406683662\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.051442 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-scripts\") pod \"e766432e-74e3-4160-adf1-1d2406683662\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.051519 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-config-data\") pod \"e766432e-74e3-4160-adf1-1d2406683662\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.051823 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-combined-ca-bundle\") pod \"e766432e-74e3-4160-adf1-1d2406683662\" (UID: \"e766432e-74e3-4160-adf1-1d2406683662\") " Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.059515 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-scripts" (OuterVolumeSpecName: "scripts") pod "e766432e-74e3-4160-adf1-1d2406683662" (UID: "e766432e-74e3-4160-adf1-1d2406683662"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.065877 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e766432e-74e3-4160-adf1-1d2406683662-kube-api-access-fx6zr" (OuterVolumeSpecName: "kube-api-access-fx6zr") pod "e766432e-74e3-4160-adf1-1d2406683662" (UID: "e766432e-74e3-4160-adf1-1d2406683662"). InnerVolumeSpecName "kube-api-access-fx6zr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.088716 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e766432e-74e3-4160-adf1-1d2406683662" (UID: "e766432e-74e3-4160-adf1-1d2406683662"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.105623 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-config-data" (OuterVolumeSpecName: "config-data") pod "e766432e-74e3-4160-adf1-1d2406683662" (UID: "e766432e-74e3-4160-adf1-1d2406683662"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.155187 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.155233 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.155245 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e766432e-74e3-4160-adf1-1d2406683662-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.155260 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fx6zr\" (UniqueName: \"kubernetes.io/projected/e766432e-74e3-4160-adf1-1d2406683662-kube-api-access-fx6zr\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.359716 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-db-sync-4c4nv" event={"ID":"e766432e-74e3-4160-adf1-1d2406683662","Type":"ContainerDied","Data":"50be7845cfbca123f24244b8589fc25cefb591747749229ece979b444850019b"} Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.359766 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50be7845cfbca123f24244b8589fc25cefb591747749229ece979b444850019b" Dec 03 08:16:07 crc kubenswrapper[4831]: I1203 08:16:07.359841 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-db-sync-4c4nv" Dec 03 08:16:08 crc kubenswrapper[4831]: I1203 08:16:08.997173 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.157976 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/aodh-0"] Dec 03 08:16:11 crc kubenswrapper[4831]: E1203 08:16:11.160776 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e766432e-74e3-4160-adf1-1d2406683662" containerName="aodh-db-sync" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.160809 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e766432e-74e3-4160-adf1-1d2406683662" containerName="aodh-db-sync" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.161096 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e766432e-74e3-4160-adf1-1d2406683662" containerName="aodh-db-sync" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.163600 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.169690 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-config-data" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.172927 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-autoscaling-dockercfg-lpjtj" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.173008 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"aodh-scripts" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.182826 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.267858 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-config-data\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.267930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-scripts\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.268138 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng4q7\" (UniqueName: \"kubernetes.io/projected/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-kube-api-access-ng4q7\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.268596 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.370802 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ng4q7\" (UniqueName: \"kubernetes.io/projected/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-kube-api-access-ng4q7\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.370901 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.370975 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-config-data\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.371013 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-scripts\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.387950 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-scripts\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.388376 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-config-data\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.388506 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-combined-ca-bundle\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.399045 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ng4q7\" (UniqueName: \"kubernetes.io/projected/53ba8fe0-f472-4cd8-b061-4e5ac2be04e2-kube-api-access-ng4q7\") pod \"aodh-0\" (UID: \"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2\") " pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.493793 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/aodh-0" Dec 03 08:16:11 crc kubenswrapper[4831]: I1203 08:16:11.978952 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/aodh-0"] Dec 03 08:16:12 crc kubenswrapper[4831]: I1203 08:16:12.427553 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2","Type":"ContainerStarted","Data":"9b4cf756d88bece726a94672b8f81519086b9eb1f79d98a719fc05c81e1ca88b"} Dec 03 08:16:13 crc kubenswrapper[4831]: I1203 08:16:13.342883 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:13 crc kubenswrapper[4831]: I1203 08:16:13.343358 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="ceilometer-central-agent" containerID="cri-o://f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3" gracePeriod=30 Dec 03 
08:16:13 crc kubenswrapper[4831]: I1203 08:16:13.343774 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="proxy-httpd" containerID="cri-o://e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9" gracePeriod=30 Dec 03 08:16:13 crc kubenswrapper[4831]: I1203 08:16:13.343831 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="sg-core" containerID="cri-o://9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33" gracePeriod=30 Dec 03 08:16:13 crc kubenswrapper[4831]: I1203 08:16:13.343871 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="ceilometer-notification-agent" containerID="cri-o://2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417" gracePeriod=30 Dec 03 08:16:13 crc kubenswrapper[4831]: I1203 08:16:13.455153 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2","Type":"ContainerStarted","Data":"a3ec9fcad68bdb633e08d02b46f35fcc5ad0dc2bee513b7a445e8444c9f2a0f3"} Dec 03 08:16:14 crc kubenswrapper[4831]: I1203 08:16:14.467765 4831 generic.go:334] "Generic (PLEG): container finished" podID="57e98544-148c-4212-810d-cf047b7efbe8" containerID="e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9" exitCode=0 Dec 03 08:16:14 crc kubenswrapper[4831]: I1203 08:16:14.468078 4831 generic.go:334] "Generic (PLEG): container finished" podID="57e98544-148c-4212-810d-cf047b7efbe8" containerID="9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33" exitCode=2 Dec 03 08:16:14 crc kubenswrapper[4831]: I1203 08:16:14.468088 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="57e98544-148c-4212-810d-cf047b7efbe8" containerID="f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3" exitCode=0 Dec 03 08:16:14 crc kubenswrapper[4831]: I1203 08:16:14.467866 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerDied","Data":"e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9"} Dec 03 08:16:14 crc kubenswrapper[4831]: I1203 08:16:14.468124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerDied","Data":"9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33"} Dec 03 08:16:14 crc kubenswrapper[4831]: I1203 08:16:14.468137 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerDied","Data":"f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3"} Dec 03 08:16:15 crc kubenswrapper[4831]: I1203 08:16:15.487036 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2","Type":"ContainerStarted","Data":"703219e25d043ee93dec7f1591dcac7c3e70c155fba219773e5d0e0f2163d600"} Dec 03 08:16:16 crc kubenswrapper[4831]: I1203 08:16:16.498304 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2","Type":"ContainerStarted","Data":"ba72b9cede296ffbf26d9946ad8f504877fa78ccbabc38c6045b1af3df78056d"} Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.212428 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.332850 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-sg-core-conf-yaml\") pod \"57e98544-148c-4212-810d-cf047b7efbe8\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.332968 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-config-data\") pod \"57e98544-148c-4212-810d-cf047b7efbe8\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.333013 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-scripts\") pod \"57e98544-148c-4212-810d-cf047b7efbe8\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.333145 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-combined-ca-bundle\") pod \"57e98544-148c-4212-810d-cf047b7efbe8\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.333189 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-run-httpd\") pod \"57e98544-148c-4212-810d-cf047b7efbe8\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.333271 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7bkvk\" (UniqueName: 
\"kubernetes.io/projected/57e98544-148c-4212-810d-cf047b7efbe8-kube-api-access-7bkvk\") pod \"57e98544-148c-4212-810d-cf047b7efbe8\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.333449 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-log-httpd\") pod \"57e98544-148c-4212-810d-cf047b7efbe8\" (UID: \"57e98544-148c-4212-810d-cf047b7efbe8\") " Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.334791 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "57e98544-148c-4212-810d-cf047b7efbe8" (UID: "57e98544-148c-4212-810d-cf047b7efbe8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.337924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "57e98544-148c-4212-810d-cf047b7efbe8" (UID: "57e98544-148c-4212-810d-cf047b7efbe8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.342688 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-scripts" (OuterVolumeSpecName: "scripts") pod "57e98544-148c-4212-810d-cf047b7efbe8" (UID: "57e98544-148c-4212-810d-cf047b7efbe8"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.345284 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57e98544-148c-4212-810d-cf047b7efbe8-kube-api-access-7bkvk" (OuterVolumeSpecName: "kube-api-access-7bkvk") pod "57e98544-148c-4212-810d-cf047b7efbe8" (UID: "57e98544-148c-4212-810d-cf047b7efbe8"). InnerVolumeSpecName "kube-api-access-7bkvk". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.380163 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "57e98544-148c-4212-810d-cf047b7efbe8" (UID: "57e98544-148c-4212-810d-cf047b7efbe8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.436380 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.436638 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.436653 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.436661 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/57e98544-148c-4212-810d-cf047b7efbe8-run-httpd\") on node \"crc\" DevicePath \"\"" 
Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.436670 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7bkvk\" (UniqueName: \"kubernetes.io/projected/57e98544-148c-4212-810d-cf047b7efbe8-kube-api-access-7bkvk\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.448371 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "57e98544-148c-4212-810d-cf047b7efbe8" (UID: "57e98544-148c-4212-810d-cf047b7efbe8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.472288 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-config-data" (OuterVolumeSpecName: "config-data") pod "57e98544-148c-4212-810d-cf047b7efbe8" (UID: "57e98544-148c-4212-810d-cf047b7efbe8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.523736 4831 generic.go:334] "Generic (PLEG): container finished" podID="57e98544-148c-4212-810d-cf047b7efbe8" containerID="2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417" exitCode=0 Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.523790 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerDied","Data":"2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417"} Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.523817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"57e98544-148c-4212-810d-cf047b7efbe8","Type":"ContainerDied","Data":"e56576bd356db12e056e8d6e01786147e5f85f945d7b198e7db0de7048b0156d"} Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.523834 4831 scope.go:117] "RemoveContainer" containerID="e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.523956 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.535923 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/aodh-0" event={"ID":"53ba8fe0-f472-4cd8-b061-4e5ac2be04e2","Type":"ContainerStarted","Data":"d7ff90eba5a2162cd6d337dc18a3d316a5d65985b2fd99138e8f53fef40024a0"} Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.538168 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.538183 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/57e98544-148c-4212-810d-cf047b7efbe8-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.554468 4831 scope.go:117] "RemoveContainer" containerID="9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.562228 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/aodh-0" podStartSLOduration=1.5650086010000002 podStartE2EDuration="7.562209976s" podCreationTimestamp="2025-12-03 08:16:11 +0000 UTC" firstStartedPulling="2025-12-03 08:16:11.984688786 +0000 UTC m=+6309.328272294" lastFinishedPulling="2025-12-03 08:16:17.981890121 +0000 UTC m=+6315.325473669" observedRunningTime="2025-12-03 08:16:18.559541402 +0000 UTC m=+6315.903124930" watchObservedRunningTime="2025-12-03 08:16:18.562209976 +0000 UTC m=+6315.905793484" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.584523 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.591632 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:18 crc 
kubenswrapper[4831]: I1203 08:16:18.619602 4831 scope.go:117] "RemoveContainer" containerID="2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621115 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:18 crc kubenswrapper[4831]: E1203 08:16:18.621537 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="proxy-httpd" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621551 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="proxy-httpd" Dec 03 08:16:18 crc kubenswrapper[4831]: E1203 08:16:18.621570 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="ceilometer-central-agent" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621576 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="ceilometer-central-agent" Dec 03 08:16:18 crc kubenswrapper[4831]: E1203 08:16:18.621592 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="sg-core" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621597 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="sg-core" Dec 03 08:16:18 crc kubenswrapper[4831]: E1203 08:16:18.621618 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="ceilometer-notification-agent" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621624 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="ceilometer-notification-agent" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621802 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="ceilometer-central-agent" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621823 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="ceilometer-notification-agent" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621833 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="proxy-httpd" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.621845 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="57e98544-148c-4212-810d-cf047b7efbe8" containerName="sg-core" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.628367 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.638762 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.638823 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.640351 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.664305 4831 scope.go:117] "RemoveContainer" containerID="f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.693491 4831 scope.go:117] "RemoveContainer" containerID="e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9" Dec 03 08:16:18 crc kubenswrapper[4831]: E1203 08:16:18.693996 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9\": container with ID starting with e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9 not found: ID does not exist" containerID="e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.694090 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9"} err="failed to get container status \"e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9\": rpc error: code = NotFound desc = could not find container \"e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9\": container with ID starting with e407d6ae433fb3bf9b430ea470de4fe512cc2b5c8bfddbc90f4240795ff6d6d9 not found: ID does not exist" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.694128 4831 scope.go:117] "RemoveContainer" containerID="9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33" Dec 03 08:16:18 crc kubenswrapper[4831]: E1203 08:16:18.697231 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33\": container with ID starting with 9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33 not found: ID does not exist" containerID="9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.697276 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33"} err="failed to get container status \"9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33\": rpc error: code = NotFound desc = could not find container \"9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33\": container with ID 
starting with 9f306b3d4cd2bfcd4c71d52235b1573c8da7635cf0d1c4eec6a95b9d1fb4dd33 not found: ID does not exist" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.697302 4831 scope.go:117] "RemoveContainer" containerID="2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417" Dec 03 08:16:18 crc kubenswrapper[4831]: E1203 08:16:18.697767 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417\": container with ID starting with 2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417 not found: ID does not exist" containerID="2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.697792 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417"} err="failed to get container status \"2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417\": rpc error: code = NotFound desc = could not find container \"2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417\": container with ID starting with 2e3a7f5864f114c0d1055f3c14cd01253187f46e0f302c1bc4a9f29c144db417 not found: ID does not exist" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.697810 4831 scope.go:117] "RemoveContainer" containerID="f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3" Dec 03 08:16:18 crc kubenswrapper[4831]: E1203 08:16:18.698471 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3\": container with ID starting with f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3 not found: ID does not exist" containerID="f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3" Dec 03 
08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.698498 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3"} err="failed to get container status \"f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3\": rpc error: code = NotFound desc = could not find container \"f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3\": container with ID starting with f02c527450753835495c5b00da6b3614b4d4a51e9e8eef804ac4a311a0bcafa3 not found: ID does not exist" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.741572 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.741617 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.741897 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-run-httpd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.742102 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vh2bd\" (UniqueName: 
\"kubernetes.io/projected/c45c9d6a-f99b-423b-aec8-e327bb32263c-kube-api-access-vh2bd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.742128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-config-data\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.742153 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-log-httpd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.742211 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-scripts\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.844818 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-run-httpd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.844930 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vh2bd\" (UniqueName: \"kubernetes.io/projected/c45c9d6a-f99b-423b-aec8-e327bb32263c-kube-api-access-vh2bd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" 
Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.844981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-config-data\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.845000 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-log-httpd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.845553 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-run-httpd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.845941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-scripts\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.846004 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-log-httpd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.846024 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.846111 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.850063 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-scripts\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.850863 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.851206 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-config-data\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.852054 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.879632 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vh2bd\" (UniqueName: 
\"kubernetes.io/projected/c45c9d6a-f99b-423b-aec8-e327bb32263c-kube-api-access-vh2bd\") pod \"ceilometer-0\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " pod="openstack/ceilometer-0" Dec 03 08:16:18 crc kubenswrapper[4831]: I1203 08:16:18.951273 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:16:19 crc kubenswrapper[4831]: I1203 08:16:19.043988 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57e98544-148c-4212-810d-cf047b7efbe8" path="/var/lib/kubelet/pods/57e98544-148c-4212-810d-cf047b7efbe8/volumes" Dec 03 08:16:19 crc kubenswrapper[4831]: I1203 08:16:19.475372 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:19 crc kubenswrapper[4831]: W1203 08:16:19.480620 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc45c9d6a_f99b_423b_aec8_e327bb32263c.slice/crio-66da56e6aad11d0fc56b9450b88e51c8b3edd858f2919f08496cef4ff5511b48 WatchSource:0}: Error finding container 66da56e6aad11d0fc56b9450b88e51c8b3edd858f2919f08496cef4ff5511b48: Status 404 returned error can't find the container with id 66da56e6aad11d0fc56b9450b88e51c8b3edd858f2919f08496cef4ff5511b48 Dec 03 08:16:19 crc kubenswrapper[4831]: I1203 08:16:19.547693 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerStarted","Data":"66da56e6aad11d0fc56b9450b88e51c8b3edd858f2919f08496cef4ff5511b48"} Dec 03 08:16:20 crc kubenswrapper[4831]: I1203 08:16:20.558670 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerStarted","Data":"03cec33174c748df5f9b0e263f08b5ac2cdc459584c8f48f8cca1ee0d2ff9c5b"} Dec 03 08:16:21 crc kubenswrapper[4831]: I1203 08:16:21.572622 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerStarted","Data":"8e9d2dcf38a1c328cdc3daab593426c6e15b26c61456e85c6f18705ff5b2f38c"} Dec 03 08:16:22 crc kubenswrapper[4831]: I1203 08:16:22.587296 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerStarted","Data":"441404eb8cb543fdbc63a8bb7af7cee9e7c21032a427c3ec88667d2477bf985c"} Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.783206 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-lqtzw"] Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.784870 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.793983 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-lqtzw"] Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.857033 4831 scope.go:117] "RemoveContainer" containerID="0e43dcc618667594635298af43a769444279529d0a4f5f00117293a6ad37d6fb" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.867118 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-operator-scripts\") pod \"manila-db-create-lqtzw\" (UID: \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\") " pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.867190 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cxxp\" (UniqueName: \"kubernetes.io/projected/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-kube-api-access-7cxxp\") pod \"manila-db-create-lqtzw\" (UID: \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\") " pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:23 crc 
kubenswrapper[4831]: I1203 08:16:23.889218 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-b907-account-create-update-8kt2t"] Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.890601 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.894549 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.922966 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b907-account-create-update-8kt2t"] Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.934381 4831 scope.go:117] "RemoveContainer" containerID="32686504a7304266d746f4f6d0cf7c9dd1e65a000d8c310e047b13bf624f9773" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.969108 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-operator-scripts\") pod \"manila-db-create-lqtzw\" (UID: \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\") " pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.969228 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cxxp\" (UniqueName: \"kubernetes.io/projected/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-kube-api-access-7cxxp\") pod \"manila-db-create-lqtzw\" (UID: \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\") " pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.969299 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk5t8\" (UniqueName: \"kubernetes.io/projected/80ab9d94-b820-49a4-a0a4-bf0c6311a691-kube-api-access-pk5t8\") pod \"manila-b907-account-create-update-8kt2t\" (UID: 
\"80ab9d94-b820-49a4-a0a4-bf0c6311a691\") " pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.969538 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab9d94-b820-49a4-a0a4-bf0c6311a691-operator-scripts\") pod \"manila-b907-account-create-update-8kt2t\" (UID: \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\") " pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.970129 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-operator-scripts\") pod \"manila-db-create-lqtzw\" (UID: \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\") " pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:23 crc kubenswrapper[4831]: I1203 08:16:23.992979 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cxxp\" (UniqueName: \"kubernetes.io/projected/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-kube-api-access-7cxxp\") pod \"manila-db-create-lqtzw\" (UID: \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\") " pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.010351 4831 scope.go:117] "RemoveContainer" containerID="2b04056302db6bd7faa4fc4e590fcd62c93780a8569813067249092a58c255ad" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.071348 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab9d94-b820-49a4-a0a4-bf0c6311a691-operator-scripts\") pod \"manila-b907-account-create-update-8kt2t\" (UID: \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\") " pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.072057 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab9d94-b820-49a4-a0a4-bf0c6311a691-operator-scripts\") pod \"manila-b907-account-create-update-8kt2t\" (UID: \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\") " pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.072871 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk5t8\" (UniqueName: \"kubernetes.io/projected/80ab9d94-b820-49a4-a0a4-bf0c6311a691-kube-api-access-pk5t8\") pod \"manila-b907-account-create-update-8kt2t\" (UID: \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\") " pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.084033 4831 scope.go:117] "RemoveContainer" containerID="8d49e18455a26e04c7b8e4d1f7356be7506a77d06a07c4d9aa202bf278a68a87" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.091412 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk5t8\" (UniqueName: \"kubernetes.io/projected/80ab9d94-b820-49a4-a0a4-bf0c6311a691-kube-api-access-pk5t8\") pod \"manila-b907-account-create-update-8kt2t\" (UID: \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\") " pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.101733 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.115645 4831 scope.go:117] "RemoveContainer" containerID="49742facba31fb15c7ee2566e3fbcab42abd2e795fff55e7a1dcf02453e23451" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.249786 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.610862 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerStarted","Data":"6287dfd308493f36dec77f60a6cf190daf888295f121b1221922e9c339a1d726"} Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.611554 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.640763 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.614081775 podStartE2EDuration="6.640745984s" podCreationTimestamp="2025-12-03 08:16:18 +0000 UTC" firstStartedPulling="2025-12-03 08:16:19.484053938 +0000 UTC m=+6316.827637456" lastFinishedPulling="2025-12-03 08:16:23.510718147 +0000 UTC m=+6320.854301665" observedRunningTime="2025-12-03 08:16:24.635075567 +0000 UTC m=+6321.978659075" watchObservedRunningTime="2025-12-03 08:16:24.640745984 +0000 UTC m=+6321.984329492" Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.733766 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-lqtzw"] Dec 03 08:16:24 crc kubenswrapper[4831]: I1203 08:16:24.880630 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-b907-account-create-update-8kt2t"] Dec 03 08:16:25 crc kubenswrapper[4831]: I1203 08:16:25.622611 4831 generic.go:334] "Generic (PLEG): container finished" podID="80ab9d94-b820-49a4-a0a4-bf0c6311a691" containerID="b3718b6295b09e61d17fb5f2eae8f839312104a9223855aae44a689d3bfde73c" exitCode=0 Dec 03 08:16:25 crc kubenswrapper[4831]: I1203 08:16:25.622673 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b907-account-create-update-8kt2t" 
event={"ID":"80ab9d94-b820-49a4-a0a4-bf0c6311a691","Type":"ContainerDied","Data":"b3718b6295b09e61d17fb5f2eae8f839312104a9223855aae44a689d3bfde73c"} Dec 03 08:16:25 crc kubenswrapper[4831]: I1203 08:16:25.623079 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b907-account-create-update-8kt2t" event={"ID":"80ab9d94-b820-49a4-a0a4-bf0c6311a691","Type":"ContainerStarted","Data":"78b7f76b9fb4ab21763d26e0dce650a5f19176dfa2cf3f05f36c85eab295f01b"} Dec 03 08:16:25 crc kubenswrapper[4831]: I1203 08:16:25.625285 4831 generic.go:334] "Generic (PLEG): container finished" podID="cbb68c80-d879-4f3a-8ea3-beea7b78a02f" containerID="a64c6b415ba478e145090cd9f084bc6147d762b59ffd9d1e875357ca9346b0f7" exitCode=0 Dec 03 08:16:25 crc kubenswrapper[4831]: I1203 08:16:25.625362 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lqtzw" event={"ID":"cbb68c80-d879-4f3a-8ea3-beea7b78a02f","Type":"ContainerDied","Data":"a64c6b415ba478e145090cd9f084bc6147d762b59ffd9d1e875357ca9346b0f7"} Dec 03 08:16:25 crc kubenswrapper[4831]: I1203 08:16:25.625414 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lqtzw" event={"ID":"cbb68c80-d879-4f3a-8ea3-beea7b78a02f","Type":"ContainerStarted","Data":"f207017547c5f89083acf50fb85bda2871cc4f31ea468bcc315179ec0553d16f"} Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.186119 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.192085 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.257290 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk5t8\" (UniqueName: \"kubernetes.io/projected/80ab9d94-b820-49a4-a0a4-bf0c6311a691-kube-api-access-pk5t8\") pod \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\" (UID: \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\") " Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.257399 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-operator-scripts\") pod \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\" (UID: \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\") " Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.257595 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cxxp\" (UniqueName: \"kubernetes.io/projected/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-kube-api-access-7cxxp\") pod \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\" (UID: \"cbb68c80-d879-4f3a-8ea3-beea7b78a02f\") " Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.257632 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab9d94-b820-49a4-a0a4-bf0c6311a691-operator-scripts\") pod \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\" (UID: \"80ab9d94-b820-49a4-a0a4-bf0c6311a691\") " Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.258819 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80ab9d94-b820-49a4-a0a4-bf0c6311a691-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80ab9d94-b820-49a4-a0a4-bf0c6311a691" (UID: "80ab9d94-b820-49a4-a0a4-bf0c6311a691"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.259977 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cbb68c80-d879-4f3a-8ea3-beea7b78a02f" (UID: "cbb68c80-d879-4f3a-8ea3-beea7b78a02f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.265789 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80ab9d94-b820-49a4-a0a4-bf0c6311a691-kube-api-access-pk5t8" (OuterVolumeSpecName: "kube-api-access-pk5t8") pod "80ab9d94-b820-49a4-a0a4-bf0c6311a691" (UID: "80ab9d94-b820-49a4-a0a4-bf0c6311a691"). InnerVolumeSpecName "kube-api-access-pk5t8". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.266114 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-kube-api-access-7cxxp" (OuterVolumeSpecName: "kube-api-access-7cxxp") pod "cbb68c80-d879-4f3a-8ea3-beea7b78a02f" (UID: "cbb68c80-d879-4f3a-8ea3-beea7b78a02f"). InnerVolumeSpecName "kube-api-access-7cxxp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.359629 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cxxp\" (UniqueName: \"kubernetes.io/projected/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-kube-api-access-7cxxp\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.359662 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80ab9d94-b820-49a4-a0a4-bf0c6311a691-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.359671 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk5t8\" (UniqueName: \"kubernetes.io/projected/80ab9d94-b820-49a4-a0a4-bf0c6311a691-kube-api-access-pk5t8\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.359680 4831 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cbb68c80-d879-4f3a-8ea3-beea7b78a02f-operator-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.651214 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-lqtzw" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.651409 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-lqtzw" event={"ID":"cbb68c80-d879-4f3a-8ea3-beea7b78a02f","Type":"ContainerDied","Data":"f207017547c5f89083acf50fb85bda2871cc4f31ea468bcc315179ec0553d16f"} Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.651700 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f207017547c5f89083acf50fb85bda2871cc4f31ea468bcc315179ec0553d16f" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.653014 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-b907-account-create-update-8kt2t" event={"ID":"80ab9d94-b820-49a4-a0a4-bf0c6311a691","Type":"ContainerDied","Data":"78b7f76b9fb4ab21763d26e0dce650a5f19176dfa2cf3f05f36c85eab295f01b"} Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.653044 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78b7f76b9fb4ab21763d26e0dce650a5f19176dfa2cf3f05f36c85eab295f01b" Dec 03 08:16:27 crc kubenswrapper[4831]: I1203 08:16:27.653095 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-b907-account-create-update-8kt2t" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.273718 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-67jx5"] Dec 03 08:16:29 crc kubenswrapper[4831]: E1203 08:16:29.274768 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80ab9d94-b820-49a4-a0a4-bf0c6311a691" containerName="mariadb-account-create-update" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.274786 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80ab9d94-b820-49a4-a0a4-bf0c6311a691" containerName="mariadb-account-create-update" Dec 03 08:16:29 crc kubenswrapper[4831]: E1203 08:16:29.274817 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cbb68c80-d879-4f3a-8ea3-beea7b78a02f" containerName="mariadb-database-create" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.274828 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cbb68c80-d879-4f3a-8ea3-beea7b78a02f" containerName="mariadb-database-create" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.275115 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cbb68c80-d879-4f3a-8ea3-beea7b78a02f" containerName="mariadb-database-create" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.275135 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="80ab9d94-b820-49a4-a0a4-bf0c6311a691" containerName="mariadb-account-create-update" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.276032 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.279177 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-4sx6k" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.287161 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.342752 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-combined-ca-bundle\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.342846 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-config-data\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.342913 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-job-config-data\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.342947 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g55cx\" (UniqueName: \"kubernetes.io/projected/1bfbd107-20c1-4f67-bd53-cd68fb695b07-kube-api-access-g55cx\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc 
kubenswrapper[4831]: I1203 08:16:29.344809 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-67jx5"] Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.444667 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-combined-ca-bundle\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.444956 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-config-data\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.445062 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-job-config-data\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.445118 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g55cx\" (UniqueName: \"kubernetes.io/projected/1bfbd107-20c1-4f67-bd53-cd68fb695b07-kube-api-access-g55cx\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.450775 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-combined-ca-bundle\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " 
pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.450892 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-config-data\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.453517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-job-config-data\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.462718 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g55cx\" (UniqueName: \"kubernetes.io/projected/1bfbd107-20c1-4f67-bd53-cd68fb695b07-kube-api-access-g55cx\") pod \"manila-db-sync-67jx5\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:29 crc kubenswrapper[4831]: I1203 08:16:29.647773 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:30 crc kubenswrapper[4831]: I1203 08:16:30.580001 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-67jx5"] Dec 03 08:16:30 crc kubenswrapper[4831]: I1203 08:16:30.685647 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-67jx5" event={"ID":"1bfbd107-20c1-4f67-bd53-cd68fb695b07","Type":"ContainerStarted","Data":"9ce09f05cba1c86bfd113a47d3503db267e60c32526b0735cc2a0295500c0c75"} Dec 03 08:16:36 crc kubenswrapper[4831]: I1203 08:16:36.768080 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-67jx5" event={"ID":"1bfbd107-20c1-4f67-bd53-cd68fb695b07","Type":"ContainerStarted","Data":"addcaa2377214201ea8a70fc4215385378472f1ccfd73b56d1fbbe4459a0925f"} Dec 03 08:16:36 crc kubenswrapper[4831]: I1203 08:16:36.799905 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-67jx5" podStartSLOduration=2.62458001 podStartE2EDuration="7.799877975s" podCreationTimestamp="2025-12-03 08:16:29 +0000 UTC" firstStartedPulling="2025-12-03 08:16:30.588406146 +0000 UTC m=+6327.931989654" lastFinishedPulling="2025-12-03 08:16:35.763704111 +0000 UTC m=+6333.107287619" observedRunningTime="2025-12-03 08:16:36.788405227 +0000 UTC m=+6334.131988735" watchObservedRunningTime="2025-12-03 08:16:36.799877975 +0000 UTC m=+6334.143461513" Dec 03 08:16:38 crc kubenswrapper[4831]: I1203 08:16:38.803116 4831 generic.go:334] "Generic (PLEG): container finished" podID="1bfbd107-20c1-4f67-bd53-cd68fb695b07" containerID="addcaa2377214201ea8a70fc4215385378472f1ccfd73b56d1fbbe4459a0925f" exitCode=0 Dec 03 08:16:38 crc kubenswrapper[4831]: I1203 08:16:38.803196 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-67jx5" 
event={"ID":"1bfbd107-20c1-4f67-bd53-cd68fb695b07","Type":"ContainerDied","Data":"addcaa2377214201ea8a70fc4215385378472f1ccfd73b56d1fbbe4459a0925f"} Dec 03 08:16:39 crc kubenswrapper[4831]: I1203 08:16:39.061595 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-b3e3-account-create-update-ktdk6"] Dec 03 08:16:39 crc kubenswrapper[4831]: I1203 08:16:39.072833 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-b3e3-account-create-update-ktdk6"] Dec 03 08:16:39 crc kubenswrapper[4831]: I1203 08:16:39.083420 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-xjhz6"] Dec 03 08:16:39 crc kubenswrapper[4831]: I1203 08:16:39.093427 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-xjhz6"] Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.450597 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.645188 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-config-data\") pod \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.645577 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g55cx\" (UniqueName: \"kubernetes.io/projected/1bfbd107-20c1-4f67-bd53-cd68fb695b07-kube-api-access-g55cx\") pod \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.645630 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-job-config-data\") pod 
\"1bfbd107-20c1-4f67-bd53-cd68fb695b07\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.645659 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-combined-ca-bundle\") pod \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\" (UID: \"1bfbd107-20c1-4f67-bd53-cd68fb695b07\") " Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.652188 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "1bfbd107-20c1-4f67-bd53-cd68fb695b07" (UID: "1bfbd107-20c1-4f67-bd53-cd68fb695b07"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.653698 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bfbd107-20c1-4f67-bd53-cd68fb695b07-kube-api-access-g55cx" (OuterVolumeSpecName: "kube-api-access-g55cx") pod "1bfbd107-20c1-4f67-bd53-cd68fb695b07" (UID: "1bfbd107-20c1-4f67-bd53-cd68fb695b07"). InnerVolumeSpecName "kube-api-access-g55cx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.657118 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-config-data" (OuterVolumeSpecName: "config-data") pod "1bfbd107-20c1-4f67-bd53-cd68fb695b07" (UID: "1bfbd107-20c1-4f67-bd53-cd68fb695b07"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.704561 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1bfbd107-20c1-4f67-bd53-cd68fb695b07" (UID: "1bfbd107-20c1-4f67-bd53-cd68fb695b07"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.748376 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.748407 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g55cx\" (UniqueName: \"kubernetes.io/projected/1bfbd107-20c1-4f67-bd53-cd68fb695b07-kube-api-access-g55cx\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.748419 4831 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-job-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.748431 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bfbd107-20c1-4f67-bd53-cd68fb695b07-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.824711 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-67jx5" event={"ID":"1bfbd107-20c1-4f67-bd53-cd68fb695b07","Type":"ContainerDied","Data":"9ce09f05cba1c86bfd113a47d3503db267e60c32526b0735cc2a0295500c0c75"} Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.824753 4831 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="9ce09f05cba1c86bfd113a47d3503db267e60c32526b0735cc2a0295500c0c75" Dec 03 08:16:40 crc kubenswrapper[4831]: I1203 08:16:40.825022 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-67jx5" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.026292 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b67cda2-032a-4d1d-aa32-8c79fb4828b4" path="/var/lib/kubelet/pods/7b67cda2-032a-4d1d-aa32-8c79fb4828b4/volumes" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.027140 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a915b2c9-9e16-4841-b5cd-f572ba326520" path="/var/lib/kubelet/pods/a915b2c9-9e16-4841-b5cd-f572ba326520/volumes" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.099018 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 08:16:41 crc kubenswrapper[4831]: E1203 08:16:41.099786 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1bfbd107-20c1-4f67-bd53-cd68fb695b07" containerName="manila-db-sync" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.099894 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1bfbd107-20c1-4f67-bd53-cd68fb695b07" containerName="manila-db-sync" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.100258 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1bfbd107-20c1-4f67-bd53-cd68fb695b07" containerName="manila-db-sync" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.101788 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.104167 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.104862 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-4sx6k" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.105079 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.106211 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.120850 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.164867 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.164933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-scripts\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.164968 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc246fd1-9874-441f-85c4-67712abd90d3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " 
pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.165115 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-config-data\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.165231 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.165344 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pp828\" (UniqueName: \"kubernetes.io/projected/cc246fd1-9874-441f-85c4-67712abd90d3-kube-api-access-pp828\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.176264 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.178120 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.180727 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.192277 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.266702 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93961821-7d28-43df-8440-250747588c2d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.266961 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ghz42\" (UniqueName: \"kubernetes.io/projected/93961821-7d28-43df-8440-250747588c2d-kube-api-access-ghz42\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.266988 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-config-data\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267026 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/93961821-7d28-43df-8440-250747588c2d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 
08:16:41.267044 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267064 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267091 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-config-data\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267106 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pp828\" (UniqueName: \"kubernetes.io/projected/cc246fd1-9874-441f-85c4-67712abd90d3-kube-api-access-pp828\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267128 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267814 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-scripts\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267851 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267897 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-scripts\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267923 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/cc246fd1-9874-441f-85c4-67712abd90d3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.267954 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/93961821-7d28-43df-8440-250747588c2d-ceph\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.272517 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/cc246fd1-9874-441f-85c4-67712abd90d3-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.278575 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.284557 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-scripts\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.289647 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.290365 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc246fd1-9874-441f-85c4-67712abd90d3-config-data\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.294999 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pp828\" (UniqueName: \"kubernetes.io/projected/cc246fd1-9874-441f-85c4-67712abd90d3-kube-api-access-pp828\") pod \"manila-scheduler-0\" (UID: \"cc246fd1-9874-441f-85c4-67712abd90d3\") " pod="openstack/manila-scheduler-0" Dec 
03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.371668 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6448f4c67c-vfcck"] Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.373344 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/93961821-7d28-43df-8440-250747588c2d-ceph\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.373504 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93961821-7d28-43df-8440-250747588c2d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.373595 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ghz42\" (UniqueName: \"kubernetes.io/projected/93961821-7d28-43df-8440-250747588c2d-kube-api-access-ghz42\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.373888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/93961821-7d28-43df-8440-250747588c2d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.373968 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " 
pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.374063 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-config-data\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.374209 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.374474 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-scripts\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.376529 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/93961821-7d28-43df-8440-250747588c2d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.376612 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/93961821-7d28-43df-8440-250747588c2d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.383851 4831 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/93961821-7d28-43df-8440-250747588c2d-ceph\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.383973 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-config-data\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.385118 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-scripts\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.385701 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.386027 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.392274 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6448f4c67c-vfcck"] Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.403159 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/93961821-7d28-43df-8440-250747588c2d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.409738 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ghz42\" (UniqueName: \"kubernetes.io/projected/93961821-7d28-43df-8440-250747588c2d-kube-api-access-ghz42\") pod \"manila-share-share1-0\" (UID: \"93961821-7d28-43df-8440-250747588c2d\") " pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.419332 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.476539 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr45r\" (UniqueName: \"kubernetes.io/projected/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-kube-api-access-jr45r\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.476674 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-nb\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.476700 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.476752 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-dns-svc\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.476773 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-config\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: 
\"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.479379 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.481277 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.485929 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.494453 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.496749 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.578144 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-nb\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.578718 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.578806 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-dns-svc\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") 
" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.578832 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-config\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.578883 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jr45r\" (UniqueName: \"kubernetes.io/projected/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-kube-api-access-jr45r\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.579629 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-config\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.579683 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-sb\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.580283 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-dns-svc\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.580799 
4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-nb\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.603119 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr45r\" (UniqueName: \"kubernetes.io/projected/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-kube-api-access-jr45r\") pod \"dnsmasq-dns-6448f4c67c-vfcck\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.682713 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-scripts\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.683037 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-etc-machine-id\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.683080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-config-data-custom\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.683116 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-fc9tv\" (UniqueName: \"kubernetes.io/projected/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-kube-api-access-fc9tv\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.683134 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-logs\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.683154 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.683227 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-config-data\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.784943 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-scripts\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.785035 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-etc-machine-id\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " 
pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.785079 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-config-data-custom\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.785117 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc9tv\" (UniqueName: \"kubernetes.io/projected/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-kube-api-access-fc9tv\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.785138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-logs\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.785158 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.785232 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-config-data\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.785909 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-logs\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.786620 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-etc-machine-id\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.789668 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.789757 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-scripts\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.790523 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-config-data\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.796628 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-config-data-custom\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.808837 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fc9tv\" (UniqueName: \"kubernetes.io/projected/4e7d2410-1d15-4296-a3cb-adc2baff3fc4-kube-api-access-fc9tv\") pod \"manila-api-0\" (UID: \"4e7d2410-1d15-4296-a3cb-adc2baff3fc4\") " pod="openstack/manila-api-0" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.878969 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:41 crc kubenswrapper[4831]: I1203 08:16:41.890073 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Dec 03 08:16:42 crc kubenswrapper[4831]: I1203 08:16:42.164930 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Dec 03 08:16:42 crc kubenswrapper[4831]: I1203 08:16:42.318636 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Dec 03 08:16:42 crc kubenswrapper[4831]: I1203 08:16:42.570702 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6448f4c67c-vfcck"] Dec 03 08:16:42 crc kubenswrapper[4831]: W1203 08:16:42.706572 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4e7d2410_1d15_4296_a3cb_adc2baff3fc4.slice/crio-db00b15fed4af9f11fec303cc6f7d06f2602e19618deefdc6cd6ec1b745f0c04 WatchSource:0}: Error finding container db00b15fed4af9f11fec303cc6f7d06f2602e19618deefdc6cd6ec1b745f0c04: Status 404 returned error can't find the container with id db00b15fed4af9f11fec303cc6f7d06f2602e19618deefdc6cd6ec1b745f0c04 Dec 03 08:16:42 crc kubenswrapper[4831]: I1203 08:16:42.707369 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Dec 03 08:16:42 crc kubenswrapper[4831]: I1203 08:16:42.900488 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" 
event={"ID":"4e7d2410-1d15-4296-a3cb-adc2baff3fc4","Type":"ContainerStarted","Data":"db00b15fed4af9f11fec303cc6f7d06f2602e19618deefdc6cd6ec1b745f0c04"} Dec 03 08:16:42 crc kubenswrapper[4831]: I1203 08:16:42.902104 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"93961821-7d28-43df-8440-250747588c2d","Type":"ContainerStarted","Data":"ba42f8de2ddfd7a82a041d02c9bcb092ec8bf14e6958d0a6ae86052ce2dc15d7"} Dec 03 08:16:42 crc kubenswrapper[4831]: I1203 08:16:42.904991 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" event={"ID":"fd0d52c9-2ad8-4410-a080-8d68193ca4e7","Type":"ContainerStarted","Data":"164ba62f43fecc4576097d97c16c86fbfb12f1b0be4ebbcbd9140b161d4cf36b"} Dec 03 08:16:42 crc kubenswrapper[4831]: I1203 08:16:42.908527 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cc246fd1-9874-441f-85c4-67712abd90d3","Type":"ContainerStarted","Data":"b67754995d584d170166c2a0f93c09422e8b8e9d614445da23ac3e86fda0cc32"} Dec 03 08:16:43 crc kubenswrapper[4831]: I1203 08:16:43.933748 4831 generic.go:334] "Generic (PLEG): container finished" podID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" containerID="56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d" exitCode=0 Dec 03 08:16:43 crc kubenswrapper[4831]: I1203 08:16:43.934421 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" event={"ID":"fd0d52c9-2ad8-4410-a080-8d68193ca4e7","Type":"ContainerDied","Data":"56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d"} Dec 03 08:16:43 crc kubenswrapper[4831]: I1203 08:16:43.937689 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cc246fd1-9874-441f-85c4-67712abd90d3","Type":"ContainerStarted","Data":"ea31dd7235ef66dfc03f7f451c14ee8bcca4e2b7cb245f10e3b736216bc6af44"} Dec 03 08:16:43 crc kubenswrapper[4831]: I1203 
08:16:43.940306 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"4e7d2410-1d15-4296-a3cb-adc2baff3fc4","Type":"ContainerStarted","Data":"f34220a178f71df1d7b174c9a21ec79e1a9cba6c3fc20a61714b1ec4f35e6faf"} Dec 03 08:16:44 crc kubenswrapper[4831]: I1203 08:16:44.961520 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" event={"ID":"fd0d52c9-2ad8-4410-a080-8d68193ca4e7","Type":"ContainerStarted","Data":"864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90"} Dec 03 08:16:44 crc kubenswrapper[4831]: I1203 08:16:44.962417 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:44 crc kubenswrapper[4831]: I1203 08:16:44.970096 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"cc246fd1-9874-441f-85c4-67712abd90d3","Type":"ContainerStarted","Data":"d255c5d3137fd92b5cf2580a5ac4d84427815850d5a1f36801242935999d0c4d"} Dec 03 08:16:44 crc kubenswrapper[4831]: I1203 08:16:44.973969 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"4e7d2410-1d15-4296-a3cb-adc2baff3fc4","Type":"ContainerStarted","Data":"b88dfba921bec362cf4a0c01457476efed9016b3f91ba9b091f473bcb01eab48"} Dec 03 08:16:44 crc kubenswrapper[4831]: I1203 08:16:44.974163 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Dec 03 08:16:44 crc kubenswrapper[4831]: I1203 08:16:44.987468 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" podStartSLOduration=3.987448132 podStartE2EDuration="3.987448132s" podCreationTimestamp="2025-12-03 08:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:16:44.980503316 +0000 UTC m=+6342.324086824" 
watchObservedRunningTime="2025-12-03 08:16:44.987448132 +0000 UTC m=+6342.331031640" Dec 03 08:16:45 crc kubenswrapper[4831]: I1203 08:16:45.032213 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=3.341787393 podStartE2EDuration="4.032193967s" podCreationTimestamp="2025-12-03 08:16:41 +0000 UTC" firstStartedPulling="2025-12-03 08:16:42.189248977 +0000 UTC m=+6339.532832485" lastFinishedPulling="2025-12-03 08:16:42.879655561 +0000 UTC m=+6340.223239059" observedRunningTime="2025-12-03 08:16:45.017981984 +0000 UTC m=+6342.361565492" watchObservedRunningTime="2025-12-03 08:16:45.032193967 +0000 UTC m=+6342.375777475" Dec 03 08:16:45 crc kubenswrapper[4831]: I1203 08:16:45.037822 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=4.037808602 podStartE2EDuration="4.037808602s" podCreationTimestamp="2025-12-03 08:16:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:16:45.036825661 +0000 UTC m=+6342.380409169" watchObservedRunningTime="2025-12-03 08:16:45.037808602 +0000 UTC m=+6342.381392110" Dec 03 08:16:47 crc kubenswrapper[4831]: I1203 08:16:47.037809 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-2mdbb"] Dec 03 08:16:47 crc kubenswrapper[4831]: I1203 08:16:47.049308 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-2mdbb"] Dec 03 08:16:48 crc kubenswrapper[4831]: I1203 08:16:48.269882 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:48 crc kubenswrapper[4831]: I1203 08:16:48.270482 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="sg-core" 
containerID="cri-o://441404eb8cb543fdbc63a8bb7af7cee9e7c21032a427c3ec88667d2477bf985c" gracePeriod=30 Dec 03 08:16:48 crc kubenswrapper[4831]: I1203 08:16:48.270556 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="proxy-httpd" containerID="cri-o://6287dfd308493f36dec77f60a6cf190daf888295f121b1221922e9c339a1d726" gracePeriod=30 Dec 03 08:16:48 crc kubenswrapper[4831]: I1203 08:16:48.270596 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="ceilometer-notification-agent" containerID="cri-o://8e9d2dcf38a1c328cdc3daab593426c6e15b26c61456e85c6f18705ff5b2f38c" gracePeriod=30 Dec 03 08:16:48 crc kubenswrapper[4831]: I1203 08:16:48.270854 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="ceilometer-central-agent" containerID="cri-o://03cec33174c748df5f9b0e263f08b5ac2cdc459584c8f48f8cca1ee0d2ff9c5b" gracePeriod=30 Dec 03 08:16:48 crc kubenswrapper[4831]: I1203 08:16:48.281167 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.146:3000/\": EOF" Dec 03 08:16:48 crc kubenswrapper[4831]: I1203 08:16:48.952800 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="proxy-httpd" probeResult="failure" output="Get \"http://10.217.1.146:3000/\": dial tcp 10.217.1.146:3000: connect: connection refused" Dec 03 08:16:49 crc kubenswrapper[4831]: I1203 08:16:49.019872 4831 generic.go:334] "Generic (PLEG): container finished" podID="c45c9d6a-f99b-423b-aec8-e327bb32263c" 
containerID="6287dfd308493f36dec77f60a6cf190daf888295f121b1221922e9c339a1d726" exitCode=0 Dec 03 08:16:49 crc kubenswrapper[4831]: I1203 08:16:49.019919 4831 generic.go:334] "Generic (PLEG): container finished" podID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerID="441404eb8cb543fdbc63a8bb7af7cee9e7c21032a427c3ec88667d2477bf985c" exitCode=2 Dec 03 08:16:49 crc kubenswrapper[4831]: I1203 08:16:49.019928 4831 generic.go:334] "Generic (PLEG): container finished" podID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerID="03cec33174c748df5f9b0e263f08b5ac2cdc459584c8f48f8cca1ee0d2ff9c5b" exitCode=0 Dec 03 08:16:49 crc kubenswrapper[4831]: I1203 08:16:49.029452 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87832c9d-bf1c-45fc-b106-bbcac2b8641c" path="/var/lib/kubelet/pods/87832c9d-bf1c-45fc-b106-bbcac2b8641c/volumes" Dec 03 08:16:49 crc kubenswrapper[4831]: I1203 08:16:49.030351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerDied","Data":"6287dfd308493f36dec77f60a6cf190daf888295f121b1221922e9c339a1d726"} Dec 03 08:16:49 crc kubenswrapper[4831]: I1203 08:16:49.030380 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerDied","Data":"441404eb8cb543fdbc63a8bb7af7cee9e7c21032a427c3ec88667d2477bf985c"} Dec 03 08:16:49 crc kubenswrapper[4831]: I1203 08:16:49.030395 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerDied","Data":"03cec33174c748df5f9b0e263f08b5ac2cdc459584c8f48f8cca1ee0d2ff9c5b"} Dec 03 08:16:51 crc kubenswrapper[4831]: I1203 08:16:51.046854 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" 
event={"ID":"93961821-7d28-43df-8440-250747588c2d","Type":"ContainerStarted","Data":"44618461fdd99233601b8f3ebbd785fc02d40165a41b7f5fcaa331e7af27d6fd"} Dec 03 08:16:51 crc kubenswrapper[4831]: I1203 08:16:51.420044 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Dec 03 08:16:51 crc kubenswrapper[4831]: I1203 08:16:51.880479 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:16:51 crc kubenswrapper[4831]: I1203 08:16:51.976385 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bfb66ff99-jlg8h"] Dec 03 08:16:51 crc kubenswrapper[4831]: I1203 08:16:51.976675 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" podUID="b217d44f-743f-48e6-a819-3483767f288b" containerName="dnsmasq-dns" containerID="cri-o://96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878" gracePeriod=10 Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.057698 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"93961821-7d28-43df-8440-250747588c2d","Type":"ContainerStarted","Data":"82bfc2e0590b281cafb17460dcfb2f5977b75198b65a772cedfdf860783bd7ce"} Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.079000 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=3.059718017 podStartE2EDuration="11.078982832s" podCreationTimestamp="2025-12-03 08:16:41 +0000 UTC" firstStartedPulling="2025-12-03 08:16:42.318158862 +0000 UTC m=+6339.661742370" lastFinishedPulling="2025-12-03 08:16:50.337423667 +0000 UTC m=+6347.681007185" observedRunningTime="2025-12-03 08:16:52.078001932 +0000 UTC m=+6349.421585450" watchObservedRunningTime="2025-12-03 08:16:52.078982832 +0000 UTC m=+6349.422566340" Dec 03 08:16:52 crc kubenswrapper[4831]: 
I1203 08:16:52.565347 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.644744 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rczfx\" (UniqueName: \"kubernetes.io/projected/b217d44f-743f-48e6-a819-3483767f288b-kube-api-access-rczfx\") pod \"b217d44f-743f-48e6-a819-3483767f288b\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.644935 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-nb\") pod \"b217d44f-743f-48e6-a819-3483767f288b\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.645221 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-dns-svc\") pod \"b217d44f-743f-48e6-a819-3483767f288b\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.645428 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-sb\") pod \"b217d44f-743f-48e6-a819-3483767f288b\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.645513 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-config\") pod \"b217d44f-743f-48e6-a819-3483767f288b\" (UID: \"b217d44f-743f-48e6-a819-3483767f288b\") " Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.693270 4831 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b217d44f-743f-48e6-a819-3483767f288b-kube-api-access-rczfx" (OuterVolumeSpecName: "kube-api-access-rczfx") pod "b217d44f-743f-48e6-a819-3483767f288b" (UID: "b217d44f-743f-48e6-a819-3483767f288b"). InnerVolumeSpecName "kube-api-access-rczfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.739677 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-config" (OuterVolumeSpecName: "config") pod "b217d44f-743f-48e6-a819-3483767f288b" (UID: "b217d44f-743f-48e6-a819-3483767f288b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.745963 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b217d44f-743f-48e6-a819-3483767f288b" (UID: "b217d44f-743f-48e6-a819-3483767f288b"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.748124 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.748164 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rczfx\" (UniqueName: \"kubernetes.io/projected/b217d44f-743f-48e6-a819-3483767f288b-kube-api-access-rczfx\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.748183 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.754722 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b217d44f-743f-48e6-a819-3483767f288b" (UID: "b217d44f-743f-48e6-a819-3483767f288b"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.764865 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b217d44f-743f-48e6-a819-3483767f288b" (UID: "b217d44f-743f-48e6-a819-3483767f288b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.851982 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:52 crc kubenswrapper[4831]: I1203 08:16:52.852537 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b217d44f-743f-48e6-a819-3483767f288b-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.092557 4831 generic.go:334] "Generic (PLEG): container finished" podID="b217d44f-743f-48e6-a819-3483767f288b" containerID="96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878" exitCode=0 Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.092651 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" event={"ID":"b217d44f-743f-48e6-a819-3483767f288b","Type":"ContainerDied","Data":"96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878"} Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.092694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" event={"ID":"b217d44f-743f-48e6-a819-3483767f288b","Type":"ContainerDied","Data":"6d50b2fa85bf36a58debc2582bd07de3f0dbbb5ef108fdbb17e753fd0c76cdc5"} Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.092714 4831 scope.go:117] "RemoveContainer" containerID="96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.092876 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bfb66ff99-jlg8h" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.112083 4831 generic.go:334] "Generic (PLEG): container finished" podID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerID="8e9d2dcf38a1c328cdc3daab593426c6e15b26c61456e85c6f18705ff5b2f38c" exitCode=0 Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.112986 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerDied","Data":"8e9d2dcf38a1c328cdc3daab593426c6e15b26c61456e85c6f18705ff5b2f38c"} Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.127546 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bfb66ff99-jlg8h"] Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.141994 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bfb66ff99-jlg8h"] Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.188619 4831 scope.go:117] "RemoveContainer" containerID="9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.284156 4831 scope.go:117] "RemoveContainer" containerID="96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878" Dec 03 08:16:53 crc kubenswrapper[4831]: E1203 08:16:53.285227 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878\": container with ID starting with 96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878 not found: ID does not exist" containerID="96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.285259 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878"} err="failed to get container status \"96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878\": rpc error: code = NotFound desc = could not find container \"96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878\": container with ID starting with 96ad549132d9ec5938ef2491ec2ea753a3e614e94d37b2df4264d29c938a5878 not found: ID does not exist" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.285279 4831 scope.go:117] "RemoveContainer" containerID="9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576" Dec 03 08:16:53 crc kubenswrapper[4831]: E1203 08:16:53.285547 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576\": container with ID starting with 9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576 not found: ID does not exist" containerID="9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.285568 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576"} err="failed to get container status \"9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576\": rpc error: code = NotFound desc = could not find container \"9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576\": container with ID starting with 9adee222cf7b0b8b08cbf85d4a58fd8912e48371faf2de5a5c1996202d02d576 not found: ID does not exist" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.561043 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.670460 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-sg-core-conf-yaml\") pod \"c45c9d6a-f99b-423b-aec8-e327bb32263c\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.670612 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vh2bd\" (UniqueName: \"kubernetes.io/projected/c45c9d6a-f99b-423b-aec8-e327bb32263c-kube-api-access-vh2bd\") pod \"c45c9d6a-f99b-423b-aec8-e327bb32263c\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.670672 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-config-data\") pod \"c45c9d6a-f99b-423b-aec8-e327bb32263c\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.670806 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-combined-ca-bundle\") pod \"c45c9d6a-f99b-423b-aec8-e327bb32263c\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.670839 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-run-httpd\") pod \"c45c9d6a-f99b-423b-aec8-e327bb32263c\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.670879 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-log-httpd\") pod \"c45c9d6a-f99b-423b-aec8-e327bb32263c\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.670952 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-scripts\") pod \"c45c9d6a-f99b-423b-aec8-e327bb32263c\" (UID: \"c45c9d6a-f99b-423b-aec8-e327bb32263c\") " Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.671656 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "c45c9d6a-f99b-423b-aec8-e327bb32263c" (UID: "c45c9d6a-f99b-423b-aec8-e327bb32263c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.671907 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "c45c9d6a-f99b-423b-aec8-e327bb32263c" (UID: "c45c9d6a-f99b-423b-aec8-e327bb32263c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.676676 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c45c9d6a-f99b-423b-aec8-e327bb32263c-kube-api-access-vh2bd" (OuterVolumeSpecName: "kube-api-access-vh2bd") pod "c45c9d6a-f99b-423b-aec8-e327bb32263c" (UID: "c45c9d6a-f99b-423b-aec8-e327bb32263c"). InnerVolumeSpecName "kube-api-access-vh2bd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.696131 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-scripts" (OuterVolumeSpecName: "scripts") pod "c45c9d6a-f99b-423b-aec8-e327bb32263c" (UID: "c45c9d6a-f99b-423b-aec8-e327bb32263c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.704525 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "c45c9d6a-f99b-423b-aec8-e327bb32263c" (UID: "c45c9d6a-f99b-423b-aec8-e327bb32263c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.759004 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c45c9d6a-f99b-423b-aec8-e327bb32263c" (UID: "c45c9d6a-f99b-423b-aec8-e327bb32263c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.774595 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.774638 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vh2bd\" (UniqueName: \"kubernetes.io/projected/c45c9d6a-f99b-423b-aec8-e327bb32263c-kube-api-access-vh2bd\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.774654 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.774669 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.774682 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/c45c9d6a-f99b-423b-aec8-e327bb32263c-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.774695 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.793259 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-config-data" (OuterVolumeSpecName: "config-data") pod "c45c9d6a-f99b-423b-aec8-e327bb32263c" (UID: "c45c9d6a-f99b-423b-aec8-e327bb32263c"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:16:53 crc kubenswrapper[4831]: I1203 08:16:53.876501 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c45c9d6a-f99b-423b-aec8-e327bb32263c-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.132766 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"c45c9d6a-f99b-423b-aec8-e327bb32263c","Type":"ContainerDied","Data":"66da56e6aad11d0fc56b9450b88e51c8b3edd858f2919f08496cef4ff5511b48"} Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.132834 4831 scope.go:117] "RemoveContainer" containerID="6287dfd308493f36dec77f60a6cf190daf888295f121b1221922e9c339a1d726" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.132881 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.162250 4831 scope.go:117] "RemoveContainer" containerID="441404eb8cb543fdbc63a8bb7af7cee9e7c21032a427c3ec88667d2477bf985c" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.170371 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.183164 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.193624 4831 scope.go:117] "RemoveContainer" containerID="8e9d2dcf38a1c328cdc3daab593426c6e15b26c61456e85c6f18705ff5b2f38c" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.205287 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:54 crc kubenswrapper[4831]: E1203 08:16:54.205839 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" 
containerName="ceilometer-central-agent" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.205862 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="ceilometer-central-agent" Dec 03 08:16:54 crc kubenswrapper[4831]: E1203 08:16:54.205875 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="proxy-httpd" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.205883 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="proxy-httpd" Dec 03 08:16:54 crc kubenswrapper[4831]: E1203 08:16:54.205905 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b217d44f-743f-48e6-a819-3483767f288b" containerName="init" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.205911 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b217d44f-743f-48e6-a819-3483767f288b" containerName="init" Dec 03 08:16:54 crc kubenswrapper[4831]: E1203 08:16:54.205922 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="ceilometer-notification-agent" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.205928 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="ceilometer-notification-agent" Dec 03 08:16:54 crc kubenswrapper[4831]: E1203 08:16:54.205950 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b217d44f-743f-48e6-a819-3483767f288b" containerName="dnsmasq-dns" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.205957 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b217d44f-743f-48e6-a819-3483767f288b" containerName="dnsmasq-dns" Dec 03 08:16:54 crc kubenswrapper[4831]: E1203 08:16:54.205970 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" 
containerName="sg-core" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.205977 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="sg-core" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.206160 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="proxy-httpd" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.206180 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="ceilometer-notification-agent" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.206192 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="sg-core" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.206204 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" containerName="ceilometer-central-agent" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.206212 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b217d44f-743f-48e6-a819-3483767f288b" containerName="dnsmasq-dns" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.208326 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.210963 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.211203 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.217758 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.226069 4831 scope.go:117] "RemoveContainer" containerID="03cec33174c748df5f9b0e263f08b5ac2cdc459584c8f48f8cca1ee0d2ff9c5b" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.285191 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-scripts\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.285229 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jl42\" (UniqueName: \"kubernetes.io/projected/0cc0b745-32e7-4c31-ad91-c51f041a9836-kube-api-access-6jl42\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.285254 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.285279 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-log-httpd\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.285438 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-run-httpd\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.285501 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-config-data\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.285717 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.387817 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-scripts\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.387873 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jl42\" (UniqueName: \"kubernetes.io/projected/0cc0b745-32e7-4c31-ad91-c51f041a9836-kube-api-access-6jl42\") pod \"ceilometer-0\" (UID: 
\"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.387906 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.387937 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-log-httpd\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.387991 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-run-httpd\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.388022 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-config-data\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.388119 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.388543 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-run-httpd\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.388628 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-log-httpd\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.392547 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.392554 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-scripts\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.393265 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.393525 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-config-data\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.411015 4831 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-6jl42\" (UniqueName: \"kubernetes.io/projected/0cc0b745-32e7-4c31-ad91-c51f041a9836-kube-api-access-6jl42\") pod \"ceilometer-0\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " pod="openstack/ceilometer-0" Dec 03 08:16:54 crc kubenswrapper[4831]: I1203 08:16:54.528398 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:16:55 crc kubenswrapper[4831]: I1203 08:16:55.024881 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b217d44f-743f-48e6-a819-3483767f288b" path="/var/lib/kubelet/pods/b217d44f-743f-48e6-a819-3483767f288b/volumes" Dec 03 08:16:55 crc kubenswrapper[4831]: I1203 08:16:55.025942 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c45c9d6a-f99b-423b-aec8-e327bb32263c" path="/var/lib/kubelet/pods/c45c9d6a-f99b-423b-aec8-e327bb32263c/volumes" Dec 03 08:16:55 crc kubenswrapper[4831]: I1203 08:16:55.054541 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:55 crc kubenswrapper[4831]: W1203 08:16:55.058447 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0cc0b745_32e7_4c31_ad91_c51f041a9836.slice/crio-f0038dd5104cf6fe5ff54fa5d8a141d28f9694a2103b3c1225232e700c0adf94 WatchSource:0}: Error finding container f0038dd5104cf6fe5ff54fa5d8a141d28f9694a2103b3c1225232e700c0adf94: Status 404 returned error can't find the container with id f0038dd5104cf6fe5ff54fa5d8a141d28f9694a2103b3c1225232e700c0adf94 Dec 03 08:16:55 crc kubenswrapper[4831]: I1203 08:16:55.144696 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerStarted","Data":"f0038dd5104cf6fe5ff54fa5d8a141d28f9694a2103b3c1225232e700c0adf94"} Dec 03 08:16:55 crc kubenswrapper[4831]: I1203 08:16:55.194806 4831 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:16:56 crc kubenswrapper[4831]: I1203 08:16:56.157772 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerStarted","Data":"347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042"} Dec 03 08:16:57 crc kubenswrapper[4831]: I1203 08:16:57.185726 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerStarted","Data":"41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35"} Dec 03 08:16:57 crc kubenswrapper[4831]: I1203 08:16:57.597249 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:16:57 crc kubenswrapper[4831]: I1203 08:16:57.597305 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:16:58 crc kubenswrapper[4831]: I1203 08:16:58.202749 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerStarted","Data":"77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a"} Dec 03 08:17:00 crc kubenswrapper[4831]: I1203 08:17:00.225190 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerStarted","Data":"c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda"} Dec 03 08:17:00 crc kubenswrapper[4831]: I1203 08:17:00.225444 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="ceilometer-central-agent" containerID="cri-o://347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042" gracePeriod=30 Dec 03 08:17:00 crc kubenswrapper[4831]: I1203 08:17:00.225957 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="proxy-httpd" containerID="cri-o://c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda" gracePeriod=30 Dec 03 08:17:00 crc kubenswrapper[4831]: I1203 08:17:00.226072 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="sg-core" containerID="cri-o://77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a" gracePeriod=30 Dec 03 08:17:00 crc kubenswrapper[4831]: I1203 08:17:00.226104 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="ceilometer-notification-agent" containerID="cri-o://41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35" gracePeriod=30 Dec 03 08:17:00 crc kubenswrapper[4831]: I1203 08:17:00.226183 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 08:17:00 crc kubenswrapper[4831]: I1203 08:17:00.253348 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.3942748209999998 podStartE2EDuration="6.253326989s" podCreationTimestamp="2025-12-03 08:16:54 +0000 
UTC" firstStartedPulling="2025-12-03 08:16:55.060752495 +0000 UTC m=+6352.404336013" lastFinishedPulling="2025-12-03 08:16:58.919804663 +0000 UTC m=+6356.263388181" observedRunningTime="2025-12-03 08:17:00.24952352 +0000 UTC m=+6357.593107048" watchObservedRunningTime="2025-12-03 08:17:00.253326989 +0000 UTC m=+6357.596910507" Dec 03 08:17:01 crc kubenswrapper[4831]: I1203 08:17:01.242994 4831 generic.go:334] "Generic (PLEG): container finished" podID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerID="c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda" exitCode=0 Dec 03 08:17:01 crc kubenswrapper[4831]: I1203 08:17:01.243309 4831 generic.go:334] "Generic (PLEG): container finished" podID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerID="77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a" exitCode=2 Dec 03 08:17:01 crc kubenswrapper[4831]: I1203 08:17:01.243336 4831 generic.go:334] "Generic (PLEG): container finished" podID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerID="41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35" exitCode=0 Dec 03 08:17:01 crc kubenswrapper[4831]: I1203 08:17:01.243050 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerDied","Data":"c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda"} Dec 03 08:17:01 crc kubenswrapper[4831]: I1203 08:17:01.243388 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerDied","Data":"77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a"} Dec 03 08:17:01 crc kubenswrapper[4831]: I1203 08:17:01.243408 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerDied","Data":"41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35"} Dec 03 
08:17:01 crc kubenswrapper[4831]: I1203 08:17:01.495957 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Dec 03 08:17:01 crc kubenswrapper[4831]: I1203 08:17:01.911870 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.086921 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-config-data\") pod \"0cc0b745-32e7-4c31-ad91-c51f041a9836\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.087252 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-scripts\") pod \"0cc0b745-32e7-4c31-ad91-c51f041a9836\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.087364 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jl42\" (UniqueName: \"kubernetes.io/projected/0cc0b745-32e7-4c31-ad91-c51f041a9836-kube-api-access-6jl42\") pod \"0cc0b745-32e7-4c31-ad91-c51f041a9836\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.087458 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-run-httpd\") pod \"0cc0b745-32e7-4c31-ad91-c51f041a9836\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.087575 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-combined-ca-bundle\") pod \"0cc0b745-32e7-4c31-ad91-c51f041a9836\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.087725 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-sg-core-conf-yaml\") pod \"0cc0b745-32e7-4c31-ad91-c51f041a9836\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.087983 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-log-httpd\") pod \"0cc0b745-32e7-4c31-ad91-c51f041a9836\" (UID: \"0cc0b745-32e7-4c31-ad91-c51f041a9836\") " Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.089619 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "0cc0b745-32e7-4c31-ad91-c51f041a9836" (UID: "0cc0b745-32e7-4c31-ad91-c51f041a9836"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.092557 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0cc0b745-32e7-4c31-ad91-c51f041a9836-kube-api-access-6jl42" (OuterVolumeSpecName: "kube-api-access-6jl42") pod "0cc0b745-32e7-4c31-ad91-c51f041a9836" (UID: "0cc0b745-32e7-4c31-ad91-c51f041a9836"). InnerVolumeSpecName "kube-api-access-6jl42". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.094899 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-scripts" (OuterVolumeSpecName: "scripts") pod "0cc0b745-32e7-4c31-ad91-c51f041a9836" (UID: "0cc0b745-32e7-4c31-ad91-c51f041a9836"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.104242 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "0cc0b745-32e7-4c31-ad91-c51f041a9836" (UID: "0cc0b745-32e7-4c31-ad91-c51f041a9836"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.104950 4831 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-log-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.104984 4831 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-scripts\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.104995 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jl42\" (UniqueName: \"kubernetes.io/projected/0cc0b745-32e7-4c31-ad91-c51f041a9836-kube-api-access-6jl42\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.105004 4831 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/0cc0b745-32e7-4c31-ad91-c51f041a9836-run-httpd\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 
08:17:02.153492 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "0cc0b745-32e7-4c31-ad91-c51f041a9836" (UID: "0cc0b745-32e7-4c31-ad91-c51f041a9836"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.207301 4831 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.207591 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0cc0b745-32e7-4c31-ad91-c51f041a9836" (UID: "0cc0b745-32e7-4c31-ad91-c51f041a9836"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.243155 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-config-data" (OuterVolumeSpecName: "config-data") pod "0cc0b745-32e7-4c31-ad91-c51f041a9836" (UID: "0cc0b745-32e7-4c31-ad91-c51f041a9836"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.259033 4831 generic.go:334] "Generic (PLEG): container finished" podID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerID="347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042" exitCode=0 Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.259076 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerDied","Data":"347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042"} Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.259103 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"0cc0b745-32e7-4c31-ad91-c51f041a9836","Type":"ContainerDied","Data":"f0038dd5104cf6fe5ff54fa5d8a141d28f9694a2103b3c1225232e700c0adf94"} Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.259120 4831 scope.go:117] "RemoveContainer" containerID="c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.259252 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.297456 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.305828 4831 scope.go:117] "RemoveContainer" containerID="77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.307485 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.309072 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.309199 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0cc0b745-32e7-4c31-ad91-c51f041a9836-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.317557 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:17:02 crc kubenswrapper[4831]: E1203 08:17:02.318057 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="ceilometer-central-agent" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.318073 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="ceilometer-central-agent" Dec 03 08:17:02 crc kubenswrapper[4831]: E1203 08:17:02.318098 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="ceilometer-notification-agent" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.318105 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" 
containerName="ceilometer-notification-agent" Dec 03 08:17:02 crc kubenswrapper[4831]: E1203 08:17:02.318122 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="sg-core" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.318128 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="sg-core" Dec 03 08:17:02 crc kubenswrapper[4831]: E1203 08:17:02.318140 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="proxy-httpd" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.318145 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="proxy-httpd" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.318437 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="sg-core" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.318457 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="ceilometer-notification-agent" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.318467 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="ceilometer-central-agent" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.318479 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" containerName="proxy-httpd" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.320330 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.327903 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.328070 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.354179 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.361543 4831 scope.go:117] "RemoveContainer" containerID="41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.386031 4831 scope.go:117] "RemoveContainer" containerID="347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.410011 4831 scope.go:117] "RemoveContainer" containerID="c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda" Dec 03 08:17:02 crc kubenswrapper[4831]: E1203 08:17:02.410510 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda\": container with ID starting with c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda not found: ID does not exist" containerID="c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.410549 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda"} err="failed to get container status \"c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda\": rpc error: code = NotFound desc = could not find container \"c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda\": 
container with ID starting with c6ea0593e52841cfe315b1a1fbbe5a0e3b8ae71f884283e33c98998a4f71acda not found: ID does not exist" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.410575 4831 scope.go:117] "RemoveContainer" containerID="77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.410927 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6db43ea-cd06-4251-92b9-3b66231110ba-run-httpd\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.411037 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6db43ea-cd06-4251-92b9-3b66231110ba-log-httpd\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: E1203 08:17:02.411048 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a\": container with ID starting with 77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a not found: ID does not exist" containerID="77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.411066 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.411086 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.411486 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-scripts\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.411698 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-config-data\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.411794 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77m9c\" (UniqueName: \"kubernetes.io/projected/f6db43ea-cd06-4251-92b9-3b66231110ba-kube-api-access-77m9c\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.411090 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a"} err="failed to get container status \"77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a\": rpc error: code = NotFound desc = could not find container \"77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a\": container with ID starting with 77b74ef84489863ca2f1f29efb10e43b20418c4355ff5117989ea5ddf33e6f5a not found: ID does not exist" Dec 
03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.412045 4831 scope.go:117] "RemoveContainer" containerID="41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35" Dec 03 08:17:02 crc kubenswrapper[4831]: E1203 08:17:02.412612 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35\": container with ID starting with 41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35 not found: ID does not exist" containerID="41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.412655 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35"} err="failed to get container status \"41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35\": rpc error: code = NotFound desc = could not find container \"41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35\": container with ID starting with 41dd686c4d27db2d367289ffe12aaa02fae5cf59d91c25931661ca5246e26c35 not found: ID does not exist" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.412686 4831 scope.go:117] "RemoveContainer" containerID="347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042" Dec 03 08:17:02 crc kubenswrapper[4831]: E1203 08:17:02.413048 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042\": container with ID starting with 347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042 not found: ID does not exist" containerID="347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.413080 4831 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042"} err="failed to get container status \"347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042\": rpc error: code = NotFound desc = could not find container \"347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042\": container with ID starting with 347a56ba3f002595bcb2038b6954987ff5b3044de178459b2d3723fcbae6e042 not found: ID does not exist" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.514027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-scripts\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.514113 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-config-data\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.514154 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77m9c\" (UniqueName: \"kubernetes.io/projected/f6db43ea-cd06-4251-92b9-3b66231110ba-kube-api-access-77m9c\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.514197 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6db43ea-cd06-4251-92b9-3b66231110ba-run-httpd\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.514252 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6db43ea-cd06-4251-92b9-3b66231110ba-log-httpd\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.514270 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.514291 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.515236 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6db43ea-cd06-4251-92b9-3b66231110ba-log-httpd\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.515246 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f6db43ea-cd06-4251-92b9-3b66231110ba-run-httpd\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.519370 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " 
pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.519678 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-scripts\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.521105 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-config-data\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.524134 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f6db43ea-cd06-4251-92b9-3b66231110ba-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.532850 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77m9c\" (UniqueName: \"kubernetes.io/projected/f6db43ea-cd06-4251-92b9-3b66231110ba-kube-api-access-77m9c\") pod \"ceilometer-0\" (UID: \"f6db43ea-cd06-4251-92b9-3b66231110ba\") " pod="openstack/ceilometer-0" Dec 03 08:17:02 crc kubenswrapper[4831]: I1203 08:17:02.663588 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Dec 03 08:17:03 crc kubenswrapper[4831]: I1203 08:17:03.042761 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0cc0b745-32e7-4c31-ad91-c51f041a9836" path="/var/lib/kubelet/pods/0cc0b745-32e7-4c31-ad91-c51f041a9836/volumes" Dec 03 08:17:03 crc kubenswrapper[4831]: I1203 08:17:03.062525 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Dec 03 08:17:03 crc kubenswrapper[4831]: I1203 08:17:03.220996 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Dec 03 08:17:03 crc kubenswrapper[4831]: I1203 08:17:03.270272 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6db43ea-cd06-4251-92b9-3b66231110ba","Type":"ContainerStarted","Data":"fb2ee255b300bbabf7baf1f27fc443e0881c44b427a619f0982b9701b75a4c27"} Dec 03 08:17:03 crc kubenswrapper[4831]: I1203 08:17:03.406908 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Dec 03 08:17:03 crc kubenswrapper[4831]: I1203 08:17:03.441805 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Dec 03 08:17:04 crc kubenswrapper[4831]: I1203 08:17:04.288183 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6db43ea-cd06-4251-92b9-3b66231110ba","Type":"ContainerStarted","Data":"f1f09b357e0300a91b2bb4a7830afe43e47684c07a367109d3d65fa0eb84dd7a"} Dec 03 08:17:05 crc kubenswrapper[4831]: I1203 08:17:05.298846 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6db43ea-cd06-4251-92b9-3b66231110ba","Type":"ContainerStarted","Data":"11f5e9c072d8ee8367534cd8c888857d27b182b1f75b8e22ce62e419233a9973"} Dec 03 08:17:06 crc kubenswrapper[4831]: I1203 08:17:06.310795 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"f6db43ea-cd06-4251-92b9-3b66231110ba","Type":"ContainerStarted","Data":"716e79cab405817b60da12fadf2f60d5f76ed6db7b548aebe7dc06c109131652"} Dec 03 08:17:08 crc kubenswrapper[4831]: I1203 08:17:08.347237 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f6db43ea-cd06-4251-92b9-3b66231110ba","Type":"ContainerStarted","Data":"ecf8e0bc151527b421ebb77b67ec41060e3c79640da462bfbdc7f0ba5a2c3108"} Dec 03 08:17:08 crc kubenswrapper[4831]: I1203 08:17:08.347667 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Dec 03 08:17:08 crc kubenswrapper[4831]: I1203 08:17:08.386720 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.378905246 podStartE2EDuration="6.386698457s" podCreationTimestamp="2025-12-03 08:17:02 +0000 UTC" firstStartedPulling="2025-12-03 08:17:03.235872145 +0000 UTC m=+6360.579455653" lastFinishedPulling="2025-12-03 08:17:07.243665356 +0000 UTC m=+6364.587248864" observedRunningTime="2025-12-03 08:17:08.369492261 +0000 UTC m=+6365.713075769" watchObservedRunningTime="2025-12-03 08:17:08.386698457 +0000 UTC m=+6365.730281965" Dec 03 08:17:24 crc kubenswrapper[4831]: I1203 08:17:24.478060 4831 scope.go:117] "RemoveContainer" containerID="8abd76b0bd850cc8d760482a65b0688aa91ebaba0281bdc1736e721dd5d31911" Dec 03 08:17:24 crc kubenswrapper[4831]: I1203 08:17:24.526102 4831 scope.go:117] "RemoveContainer" containerID="73a3bc39da2ba251a65c32c919cb5e27ca0fd492c08cdecb2d19a87f429875c0" Dec 03 08:17:24 crc kubenswrapper[4831]: I1203 08:17:24.565736 4831 scope.go:117] "RemoveContainer" containerID="60f57e9f1925805d73f9e26237c75ccc38b17081cf89e1d75e623dcd98b9d30c" Dec 03 08:17:26 crc kubenswrapper[4831]: I1203 08:17:26.937534 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-ds5tb"] Dec 03 08:17:26 crc kubenswrapper[4831]: 
I1203 08:17:26.942888 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:26 crc kubenswrapper[4831]: I1203 08:17:26.948509 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ds5tb"] Dec 03 08:17:26 crc kubenswrapper[4831]: I1203 08:17:26.994550 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-catalog-content\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:26 crc kubenswrapper[4831]: I1203 08:17:26.994697 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89f94\" (UniqueName: \"kubernetes.io/projected/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-kube-api-access-89f94\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:26 crc kubenswrapper[4831]: I1203 08:17:26.994763 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-utilities\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.096791 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-utilities\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:27 crc kubenswrapper[4831]: 
I1203 08:17:27.097094 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-catalog-content\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.097178 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-89f94\" (UniqueName: \"kubernetes.io/projected/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-kube-api-access-89f94\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.098947 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-utilities\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.098991 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-catalog-content\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.121334 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-89f94\" (UniqueName: \"kubernetes.io/projected/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-kube-api-access-89f94\") pod \"certified-operators-ds5tb\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.306525 4831 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.598129 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.598772 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:17:27 crc kubenswrapper[4831]: I1203 08:17:27.837454 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-ds5tb"] Dec 03 08:17:28 crc kubenswrapper[4831]: I1203 08:17:28.603362 4831 generic.go:334] "Generic (PLEG): container finished" podID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerID="0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b" exitCode=0 Dec 03 08:17:28 crc kubenswrapper[4831]: I1203 08:17:28.603502 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds5tb" event={"ID":"f787a486-4e3a-4abb-a255-b9ba80bbbd1d","Type":"ContainerDied","Data":"0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b"} Dec 03 08:17:28 crc kubenswrapper[4831]: I1203 08:17:28.603700 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds5tb" event={"ID":"f787a486-4e3a-4abb-a255-b9ba80bbbd1d","Type":"ContainerStarted","Data":"f7622b5ae51562260b6da8569aaafb552f53c8b9feda44d23311a6b791235dad"} Dec 03 08:17:30 crc kubenswrapper[4831]: I1203 
08:17:30.624043 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds5tb" event={"ID":"f787a486-4e3a-4abb-a255-b9ba80bbbd1d","Type":"ContainerStarted","Data":"48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183"} Dec 03 08:17:31 crc kubenswrapper[4831]: I1203 08:17:31.640458 4831 generic.go:334] "Generic (PLEG): container finished" podID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerID="48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183" exitCode=0 Dec 03 08:17:31 crc kubenswrapper[4831]: I1203 08:17:31.640508 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds5tb" event={"ID":"f787a486-4e3a-4abb-a255-b9ba80bbbd1d","Type":"ContainerDied","Data":"48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183"} Dec 03 08:17:32 crc kubenswrapper[4831]: I1203 08:17:32.655013 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds5tb" event={"ID":"f787a486-4e3a-4abb-a255-b9ba80bbbd1d","Type":"ContainerStarted","Data":"2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983"} Dec 03 08:17:32 crc kubenswrapper[4831]: I1203 08:17:32.678336 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-ds5tb" podStartSLOduration=3.1431151489999998 podStartE2EDuration="6.678304301s" podCreationTimestamp="2025-12-03 08:17:26 +0000 UTC" firstStartedPulling="2025-12-03 08:17:28.607957611 +0000 UTC m=+6385.951541119" lastFinishedPulling="2025-12-03 08:17:32.143146763 +0000 UTC m=+6389.486730271" observedRunningTime="2025-12-03 08:17:32.673818531 +0000 UTC m=+6390.017402079" watchObservedRunningTime="2025-12-03 08:17:32.678304301 +0000 UTC m=+6390.021887809" Dec 03 08:17:32 crc kubenswrapper[4831]: I1203 08:17:32.678631 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Dec 03 
08:17:37 crc kubenswrapper[4831]: I1203 08:17:37.307407 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:37 crc kubenswrapper[4831]: I1203 08:17:37.308000 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:37 crc kubenswrapper[4831]: I1203 08:17:37.358547 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:37 crc kubenswrapper[4831]: I1203 08:17:37.786782 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:37 crc kubenswrapper[4831]: I1203 08:17:37.838478 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ds5tb"] Dec 03 08:17:39 crc kubenswrapper[4831]: I1203 08:17:39.748874 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-ds5tb" podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerName="registry-server" containerID="cri-o://2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983" gracePeriod=2 Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.374288 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.499849 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-89f94\" (UniqueName: \"kubernetes.io/projected/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-kube-api-access-89f94\") pod \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.499959 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-utilities\") pod \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.500034 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-catalog-content\") pod \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\" (UID: \"f787a486-4e3a-4abb-a255-b9ba80bbbd1d\") " Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.502102 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-utilities" (OuterVolumeSpecName: "utilities") pod "f787a486-4e3a-4abb-a255-b9ba80bbbd1d" (UID: "f787a486-4e3a-4abb-a255-b9ba80bbbd1d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.511212 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-kube-api-access-89f94" (OuterVolumeSpecName: "kube-api-access-89f94") pod "f787a486-4e3a-4abb-a255-b9ba80bbbd1d" (UID: "f787a486-4e3a-4abb-a255-b9ba80bbbd1d"). InnerVolumeSpecName "kube-api-access-89f94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.590841 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f787a486-4e3a-4abb-a255-b9ba80bbbd1d" (UID: "f787a486-4e3a-4abb-a255-b9ba80bbbd1d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.602060 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-89f94\" (UniqueName: \"kubernetes.io/projected/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-kube-api-access-89f94\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.602097 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.602109 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f787a486-4e3a-4abb-a255-b9ba80bbbd1d-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.760396 4831 generic.go:334] "Generic (PLEG): container finished" podID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerID="2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983" exitCode=0 Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.760438 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-ds5tb" event={"ID":"f787a486-4e3a-4abb-a255-b9ba80bbbd1d","Type":"ContainerDied","Data":"2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983"} Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.760474 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-ds5tb" event={"ID":"f787a486-4e3a-4abb-a255-b9ba80bbbd1d","Type":"ContainerDied","Data":"f7622b5ae51562260b6da8569aaafb552f53c8b9feda44d23311a6b791235dad"} Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.760495 4831 scope.go:117] "RemoveContainer" containerID="2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.760527 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-ds5tb" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.794893 4831 scope.go:117] "RemoveContainer" containerID="48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.803578 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-ds5tb"] Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.818160 4831 scope.go:117] "RemoveContainer" containerID="0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.823250 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-ds5tb"] Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.877483 4831 scope.go:117] "RemoveContainer" containerID="2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983" Dec 03 08:17:40 crc kubenswrapper[4831]: E1203 08:17:40.877945 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983\": container with ID starting with 2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983 not found: ID does not exist" containerID="2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 
08:17:40.877986 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983"} err="failed to get container status \"2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983\": rpc error: code = NotFound desc = could not find container \"2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983\": container with ID starting with 2932d0827a4aacfe71e173fe04bedd15478acdd93e2c6a1124d79d82c9d2a983 not found: ID does not exist" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.878017 4831 scope.go:117] "RemoveContainer" containerID="48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183" Dec 03 08:17:40 crc kubenswrapper[4831]: E1203 08:17:40.878338 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183\": container with ID starting with 48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183 not found: ID does not exist" containerID="48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.878361 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183"} err="failed to get container status \"48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183\": rpc error: code = NotFound desc = could not find container \"48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183\": container with ID starting with 48195ee627286695da09383ae2cfeb4fd851b670201743fc689bb29f975f2183 not found: ID does not exist" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.878373 4831 scope.go:117] "RemoveContainer" containerID="0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b" Dec 03 08:17:40 crc 
kubenswrapper[4831]: E1203 08:17:40.878640 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b\": container with ID starting with 0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b not found: ID does not exist" containerID="0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b" Dec 03 08:17:40 crc kubenswrapper[4831]: I1203 08:17:40.878743 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b"} err="failed to get container status \"0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b\": rpc error: code = NotFound desc = could not find container \"0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b\": container with ID starting with 0ad2e7e1c1ae9f45088bc71879fd0fa22c37c523bea7c18880ad6029b0fce58b not found: ID does not exist" Dec 03 08:17:41 crc kubenswrapper[4831]: I1203 08:17:41.030036 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" path="/var/lib/kubelet/pods/f787a486-4e3a-4abb-a255-b9ba80bbbd1d/volumes" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.779953 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7444cd68c7-2ggnl"] Dec 03 08:17:52 crc kubenswrapper[4831]: E1203 08:17:52.781801 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerName="extract-content" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.781899 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerName="extract-content" Dec 03 08:17:52 crc kubenswrapper[4831]: E1203 08:17:52.782007 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerName="extract-utilities" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.782081 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerName="extract-utilities" Dec 03 08:17:52 crc kubenswrapper[4831]: E1203 08:17:52.782154 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerName="registry-server" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.782217 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerName="registry-server" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.782562 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f787a486-4e3a-4abb-a255-b9ba80bbbd1d" containerName="registry-server" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.784067 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.786608 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.799006 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7444cd68c7-2ggnl"] Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.916356 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-dns-svc\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.917220 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-sb\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.917273 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khnhz\" (UniqueName: \"kubernetes.io/projected/3dd64874-324e-4961-ac48-1f768a8f9ebe-kube-api-access-khnhz\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.917299 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-openstack-cell1\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.917346 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-nb\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:52 crc kubenswrapper[4831]: I1203 08:17:52.917512 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-config\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.018895 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-dns-svc\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.018950 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-sb\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.018980 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khnhz\" (UniqueName: \"kubernetes.io/projected/3dd64874-324e-4961-ac48-1f768a8f9ebe-kube-api-access-khnhz\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.019000 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-openstack-cell1\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.019023 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-nb\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.019148 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-config\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.020238 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-openstack-cell1\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.020291 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-config\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.020441 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-sb\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.020457 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-dns-svc\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.020979 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-nb\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: 
\"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.055551 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khnhz\" (UniqueName: \"kubernetes.io/projected/3dd64874-324e-4961-ac48-1f768a8f9ebe-kube-api-access-khnhz\") pod \"dnsmasq-dns-7444cd68c7-2ggnl\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.112805 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.597596 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7444cd68c7-2ggnl"] Dec 03 08:17:53 crc kubenswrapper[4831]: I1203 08:17:53.901523 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" event={"ID":"3dd64874-324e-4961-ac48-1f768a8f9ebe","Type":"ContainerStarted","Data":"219ed7fce0ff5f4e3a547a2944792274f04e3dd8ea25d0e0854c06ef57449505"} Dec 03 08:17:54 crc kubenswrapper[4831]: I1203 08:17:54.916013 4831 generic.go:334] "Generic (PLEG): container finished" podID="3dd64874-324e-4961-ac48-1f768a8f9ebe" containerID="d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32" exitCode=0 Dec 03 08:17:54 crc kubenswrapper[4831]: I1203 08:17:54.916071 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" event={"ID":"3dd64874-324e-4961-ac48-1f768a8f9ebe","Type":"ContainerDied","Data":"d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32"} Dec 03 08:17:55 crc kubenswrapper[4831]: I1203 08:17:55.935407 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" 
event={"ID":"3dd64874-324e-4961-ac48-1f768a8f9ebe","Type":"ContainerStarted","Data":"52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af"} Dec 03 08:17:55 crc kubenswrapper[4831]: I1203 08:17:55.935879 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:17:55 crc kubenswrapper[4831]: I1203 08:17:55.974131 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" podStartSLOduration=3.974105414 podStartE2EDuration="3.974105414s" podCreationTimestamp="2025-12-03 08:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:17:55.954849274 +0000 UTC m=+6413.298432832" watchObservedRunningTime="2025-12-03 08:17:55.974105414 +0000 UTC m=+6413.317688952" Dec 03 08:17:57 crc kubenswrapper[4831]: I1203 08:17:57.596447 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:17:57 crc kubenswrapper[4831]: I1203 08:17:57.596887 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:17:57 crc kubenswrapper[4831]: I1203 08:17:57.596964 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:17:57 crc kubenswrapper[4831]: I1203 08:17:57.598303 4831 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:17:57 crc kubenswrapper[4831]: I1203 08:17:57.598456 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" gracePeriod=600 Dec 03 08:17:57 crc kubenswrapper[4831]: E1203 08:17:57.733162 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:17:57 crc kubenswrapper[4831]: I1203 08:17:57.959529 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" exitCode=0 Dec 03 08:17:57 crc kubenswrapper[4831]: I1203 08:17:57.959571 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911"} Dec 03 08:17:57 crc kubenswrapper[4831]: I1203 08:17:57.959603 4831 scope.go:117] "RemoveContainer" containerID="98f056b5c45cb7c3ebbde6e6e8b7c794ed7234028aab60b3d5caa65ca4fbade0" Dec 03 08:17:57 crc 
kubenswrapper[4831]: I1203 08:17:57.960742 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:17:57 crc kubenswrapper[4831]: E1203 08:17:57.961505 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.115250 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.192818 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6448f4c67c-vfcck"] Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.193144 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" podUID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" containerName="dnsmasq-dns" containerID="cri-o://864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90" gracePeriod=10 Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.373133 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5db96bfff9-kgj7z"] Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.380272 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.406960 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db96bfff9-kgj7z"] Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.592084 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtg78\" (UniqueName: \"kubernetes.io/projected/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-kube-api-access-gtg78\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.592475 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-config\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.592711 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-ovsdbserver-nb\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.592858 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-ovsdbserver-sb\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.592949 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-dns-svc\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.593074 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-openstack-cell1\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.699036 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-config\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.699105 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-ovsdbserver-nb\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.699137 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-ovsdbserver-sb\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.699176 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-dns-svc\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.699223 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-openstack-cell1\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.699288 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gtg78\" (UniqueName: \"kubernetes.io/projected/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-kube-api-access-gtg78\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.700744 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-config\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.701347 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-ovsdbserver-nb\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.702162 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-cell1\" (UniqueName: 
\"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-openstack-cell1\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.702180 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-dns-svc\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.702448 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-ovsdbserver-sb\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.735816 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtg78\" (UniqueName: \"kubernetes.io/projected/4bc527be-3f47-4fac-9edd-4252cc8e6ee1-kube-api-access-gtg78\") pod \"dnsmasq-dns-5db96bfff9-kgj7z\" (UID: \"4bc527be-3f47-4fac-9edd-4252cc8e6ee1\") " pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.824024 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.905719 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jr45r\" (UniqueName: \"kubernetes.io/projected/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-kube-api-access-jr45r\") pod \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.905784 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-dns-svc\") pod \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.905842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-sb\") pod \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.905997 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-config\") pod \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.906023 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-nb\") pod \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\" (UID: \"fd0d52c9-2ad8-4410-a080-8d68193ca4e7\") " Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.927633 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-kube-api-access-jr45r" (OuterVolumeSpecName: "kube-api-access-jr45r") pod "fd0d52c9-2ad8-4410-a080-8d68193ca4e7" (UID: "fd0d52c9-2ad8-4410-a080-8d68193ca4e7"). InnerVolumeSpecName "kube-api-access-jr45r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.963601 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-config" (OuterVolumeSpecName: "config") pod "fd0d52c9-2ad8-4410-a080-8d68193ca4e7" (UID: "fd0d52c9-2ad8-4410-a080-8d68193ca4e7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.977699 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fd0d52c9-2ad8-4410-a080-8d68193ca4e7" (UID: "fd0d52c9-2ad8-4410-a080-8d68193ca4e7"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:03 crc kubenswrapper[4831]: I1203 08:18:03.980054 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fd0d52c9-2ad8-4410-a080-8d68193ca4e7" (UID: "fd0d52c9-2ad8-4410-a080-8d68193ca4e7"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.014307 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.016583 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.016614 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.016627 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.016636 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jr45r\" (UniqueName: \"kubernetes.io/projected/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-kube-api-access-jr45r\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.018925 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fd0d52c9-2ad8-4410-a080-8d68193ca4e7" (UID: "fd0d52c9-2ad8-4410-a080-8d68193ca4e7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.041504 4831 generic.go:334] "Generic (PLEG): container finished" podID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" containerID="864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90" exitCode=0 Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.041583 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.041597 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" event={"ID":"fd0d52c9-2ad8-4410-a080-8d68193ca4e7","Type":"ContainerDied","Data":"864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90"} Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.041656 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6448f4c67c-vfcck" event={"ID":"fd0d52c9-2ad8-4410-a080-8d68193ca4e7","Type":"ContainerDied","Data":"164ba62f43fecc4576097d97c16c86fbfb12f1b0be4ebbcbd9140b161d4cf36b"} Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.041677 4831 scope.go:117] "RemoveContainer" containerID="864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.085495 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6448f4c67c-vfcck"] Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.086147 4831 scope.go:117] "RemoveContainer" containerID="56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.094229 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6448f4c67c-vfcck"] Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.123078 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fd0d52c9-2ad8-4410-a080-8d68193ca4e7-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.124477 4831 scope.go:117] "RemoveContainer" containerID="864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90" Dec 03 08:18:04 crc kubenswrapper[4831]: E1203 08:18:04.126708 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90\": container with ID starting with 864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90 not found: ID does not exist" containerID="864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.126922 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90"} err="failed to get container status \"864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90\": rpc error: code = NotFound desc = could not find container \"864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90\": container with ID starting with 864892b9eea956390575a465e97829cc94b6ff18d69a7cd820f16cd52a604e90 not found: ID does not exist" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.126942 4831 scope.go:117] "RemoveContainer" containerID="56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d" Dec 03 08:18:04 crc kubenswrapper[4831]: E1203 08:18:04.127373 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d\": container with ID starting with 56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d not found: ID does not exist" containerID="56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.127415 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d"} err="failed to get container status \"56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d\": rpc error: code = NotFound desc = could not find container \"56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d\": container 
with ID starting with 56d46623a1859183348e18182e4a23ddca9cf73d31374f383f703b029925506d not found: ID does not exist" Dec 03 08:18:04 crc kubenswrapper[4831]: I1203 08:18:04.530588 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5db96bfff9-kgj7z"] Dec 03 08:18:05 crc kubenswrapper[4831]: I1203 08:18:05.030937 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" path="/var/lib/kubelet/pods/fd0d52c9-2ad8-4410-a080-8d68193ca4e7/volumes" Dec 03 08:18:05 crc kubenswrapper[4831]: I1203 08:18:05.059533 4831 generic.go:334] "Generic (PLEG): container finished" podID="4bc527be-3f47-4fac-9edd-4252cc8e6ee1" containerID="2799b55002f20018ccd480a670c9aff61db036254f463c1345ea5e7228af1388" exitCode=0 Dec 03 08:18:05 crc kubenswrapper[4831]: I1203 08:18:05.059595 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" event={"ID":"4bc527be-3f47-4fac-9edd-4252cc8e6ee1","Type":"ContainerDied","Data":"2799b55002f20018ccd480a670c9aff61db036254f463c1345ea5e7228af1388"} Dec 03 08:18:05 crc kubenswrapper[4831]: I1203 08:18:05.059882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" event={"ID":"4bc527be-3f47-4fac-9edd-4252cc8e6ee1","Type":"ContainerStarted","Data":"53ccfd931248546604be513781504c8072202158051b072d09cced8f844e1de4"} Dec 03 08:18:06 crc kubenswrapper[4831]: I1203 08:18:06.079697 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" event={"ID":"4bc527be-3f47-4fac-9edd-4252cc8e6ee1","Type":"ContainerStarted","Data":"7e68f4dbf79b4522bc41bf4b93300d8960d7de6725fdd0ef55a1c2e9f86a107e"} Dec 03 08:18:06 crc kubenswrapper[4831]: I1203 08:18:06.081622 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:06 crc kubenswrapper[4831]: I1203 08:18:06.121149 4831 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" podStartSLOduration=3.121127274 podStartE2EDuration="3.121127274s" podCreationTimestamp="2025-12-03 08:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:18:06.106464697 +0000 UTC m=+6423.450048255" watchObservedRunningTime="2025-12-03 08:18:06.121127274 +0000 UTC m=+6423.464710782" Dec 03 08:18:10 crc kubenswrapper[4831]: I1203 08:18:10.013607 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:18:10 crc kubenswrapper[4831]: E1203 08:18:10.014739 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:18:14 crc kubenswrapper[4831]: I1203 08:18:14.015576 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5db96bfff9-kgj7z" Dec 03 08:18:14 crc kubenswrapper[4831]: I1203 08:18:14.184073 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7444cd68c7-2ggnl"] Dec 03 08:18:14 crc kubenswrapper[4831]: I1203 08:18:14.184581 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" podUID="3dd64874-324e-4961-ac48-1f768a8f9ebe" containerName="dnsmasq-dns" containerID="cri-o://52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af" gracePeriod=10 Dec 03 08:18:14 crc kubenswrapper[4831]: I1203 08:18:14.895046 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.080446 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-nb\") pod \"3dd64874-324e-4961-ac48-1f768a8f9ebe\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.080834 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-openstack-cell1\") pod \"3dd64874-324e-4961-ac48-1f768a8f9ebe\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.080979 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-config\") pod \"3dd64874-324e-4961-ac48-1f768a8f9ebe\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.081110 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-sb\") pod \"3dd64874-324e-4961-ac48-1f768a8f9ebe\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.081162 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khnhz\" (UniqueName: \"kubernetes.io/projected/3dd64874-324e-4961-ac48-1f768a8f9ebe-kube-api-access-khnhz\") pod \"3dd64874-324e-4961-ac48-1f768a8f9ebe\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.081239 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-dns-svc\") pod \"3dd64874-324e-4961-ac48-1f768a8f9ebe\" (UID: \"3dd64874-324e-4961-ac48-1f768a8f9ebe\") " Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.101574 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3dd64874-324e-4961-ac48-1f768a8f9ebe-kube-api-access-khnhz" (OuterVolumeSpecName: "kube-api-access-khnhz") pod "3dd64874-324e-4961-ac48-1f768a8f9ebe" (UID: "3dd64874-324e-4961-ac48-1f768a8f9ebe"). InnerVolumeSpecName "kube-api-access-khnhz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.144126 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-openstack-cell1" (OuterVolumeSpecName: "openstack-cell1") pod "3dd64874-324e-4961-ac48-1f768a8f9ebe" (UID: "3dd64874-324e-4961-ac48-1f768a8f9ebe"). InnerVolumeSpecName "openstack-cell1". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.149897 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-config" (OuterVolumeSpecName: "config") pod "3dd64874-324e-4961-ac48-1f768a8f9ebe" (UID: "3dd64874-324e-4961-ac48-1f768a8f9ebe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.162201 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "3dd64874-324e-4961-ac48-1f768a8f9ebe" (UID: "3dd64874-324e-4961-ac48-1f768a8f9ebe"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.166945 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "3dd64874-324e-4961-ac48-1f768a8f9ebe" (UID: "3dd64874-324e-4961-ac48-1f768a8f9ebe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.170304 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "3dd64874-324e-4961-ac48-1f768a8f9ebe" (UID: "3dd64874-324e-4961-ac48-1f768a8f9ebe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.183721 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.183757 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khnhz\" (UniqueName: \"kubernetes.io/projected/3dd64874-324e-4961-ac48-1f768a8f9ebe-kube-api-access-khnhz\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.183771 4831 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-dns-svc\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.183782 4831 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Dec 03 
08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.183793 4831 reconciler_common.go:293] "Volume detached for volume \"openstack-cell1\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.183805 4831 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3dd64874-324e-4961-ac48-1f768a8f9ebe-config\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.196645 4831 generic.go:334] "Generic (PLEG): container finished" podID="3dd64874-324e-4961-ac48-1f768a8f9ebe" containerID="52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af" exitCode=0 Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.196694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" event={"ID":"3dd64874-324e-4961-ac48-1f768a8f9ebe","Type":"ContainerDied","Data":"52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af"} Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.196722 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" event={"ID":"3dd64874-324e-4961-ac48-1f768a8f9ebe","Type":"ContainerDied","Data":"219ed7fce0ff5f4e3a547a2944792274f04e3dd8ea25d0e0854c06ef57449505"} Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.196741 4831 scope.go:117] "RemoveContainer" containerID="52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.196893 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7444cd68c7-2ggnl" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.245598 4831 scope.go:117] "RemoveContainer" containerID="d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.255511 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7444cd68c7-2ggnl"] Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.265739 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7444cd68c7-2ggnl"] Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.276124 4831 scope.go:117] "RemoveContainer" containerID="52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af" Dec 03 08:18:15 crc kubenswrapper[4831]: E1203 08:18:15.277206 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af\": container with ID starting with 52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af not found: ID does not exist" containerID="52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.277257 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af"} err="failed to get container status \"52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af\": rpc error: code = NotFound desc = could not find container \"52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af\": container with ID starting with 52bb5d819838a872ea9e30546ad1e85afd6223bc23a9519348c6576bbb82e9af not found: ID does not exist" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.277286 4831 scope.go:117] "RemoveContainer" containerID="d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32" Dec 03 
08:18:15 crc kubenswrapper[4831]: E1203 08:18:15.277608 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32\": container with ID starting with d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32 not found: ID does not exist" containerID="d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32" Dec 03 08:18:15 crc kubenswrapper[4831]: I1203 08:18:15.277629 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32"} err="failed to get container status \"d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32\": rpc error: code = NotFound desc = could not find container \"d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32\": container with ID starting with d56c6dfd6c9477d21153e3ba5a2597e0c8ea4260f5739ca124904c84fa5a9f32 not found: ID does not exist" Dec 03 08:18:17 crc kubenswrapper[4831]: I1203 08:18:17.033033 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3dd64874-324e-4961-ac48-1f768a8f9ebe" path="/var/lib/kubelet/pods/3dd64874-324e-4961-ac48-1f768a8f9ebe/volumes" Dec 03 08:18:23 crc kubenswrapper[4831]: I1203 08:18:23.022083 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:18:23 crc kubenswrapper[4831]: E1203 08:18:23.022920 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:18:24 crc 
kubenswrapper[4831]: I1203 08:18:24.750853 4831 scope.go:117] "RemoveContainer" containerID="4b30e49b85f19483632d3406a2b8c323cb1f4a08f16fc633981378ca2004b5e9" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.824596 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw"] Dec 03 08:18:24 crc kubenswrapper[4831]: E1203 08:18:24.825337 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" containerName="dnsmasq-dns" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.825368 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" containerName="dnsmasq-dns" Dec 03 08:18:24 crc kubenswrapper[4831]: E1203 08:18:24.825428 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" containerName="init" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.825441 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" containerName="init" Dec 03 08:18:24 crc kubenswrapper[4831]: E1203 08:18:24.825497 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd64874-324e-4961-ac48-1f768a8f9ebe" containerName="dnsmasq-dns" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.825511 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd64874-324e-4961-ac48-1f768a8f9ebe" containerName="dnsmasq-dns" Dec 03 08:18:24 crc kubenswrapper[4831]: E1203 08:18:24.825553 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3dd64874-324e-4961-ac48-1f768a8f9ebe" containerName="init" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.825565 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="3dd64874-324e-4961-ac48-1f768a8f9ebe" containerName="init" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.825935 4831 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="3dd64874-324e-4961-ac48-1f768a8f9ebe" containerName="dnsmasq-dns" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.825993 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd0d52c9-2ad8-4410-a080-8d68193ca4e7" containerName="dnsmasq-dns" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.827424 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.830867 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.832818 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.833019 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.833392 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.842236 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw"] Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.922212 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.922307 
4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.922404 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.922441 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztg86\" (UniqueName: \"kubernetes.io/projected/b73afe21-6869-4926-b295-8ec52f0e41be-kube-api-access-ztg86\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.922815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:24 crc kubenswrapper[4831]: I1203 08:18:24.994770 4831 scope.go:117] "RemoveContainer" containerID="f7671b418fa36a50c5d5c171e0f1d8106072742f881b63c834b8c5b2bf3bcd32" Dec 03 
08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.024039 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.024132 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.024177 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.024207 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.024224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztg86\" 
(UniqueName: \"kubernetes.io/projected/b73afe21-6869-4926-b295-8ec52f0e41be-kube-api-access-ztg86\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.033291 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ssh-key\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.033296 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-pre-adoption-validation-combined-ca-bundle\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.033829 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ceph\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.034221 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-inventory\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: 
\"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.045725 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztg86\" (UniqueName: \"kubernetes.io/projected/b73afe21-6869-4926-b295-8ec52f0e41be-kube-api-access-ztg86\") pod \"pre-adoption-validation-openstack-pre-adoption-openstack-c746qw\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.158856 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:25 crc kubenswrapper[4831]: I1203 08:18:25.729412 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw"] Dec 03 08:18:26 crc kubenswrapper[4831]: I1203 08:18:26.404558 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" event={"ID":"b73afe21-6869-4926-b295-8ec52f0e41be","Type":"ContainerStarted","Data":"c8993c21c8c75b0c8610eb373a0dfbf6393f1d772ed3bd76c27db62eac75a1ad"} Dec 03 08:18:37 crc kubenswrapper[4831]: I1203 08:18:37.013519 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:18:37 crc kubenswrapper[4831]: E1203 08:18:37.014270 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:18:40 crc kubenswrapper[4831]: I1203 08:18:40.591648 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" event={"ID":"b73afe21-6869-4926-b295-8ec52f0e41be","Type":"ContainerStarted","Data":"796f16f2d2232b30630323437295e00b01827f22294ea418a13a9285c8ef9354"} Dec 03 08:18:40 crc kubenswrapper[4831]: I1203 08:18:40.614825 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" podStartSLOduration=2.979390086 podStartE2EDuration="16.614806267s" podCreationTimestamp="2025-12-03 08:18:24 +0000 UTC" firstStartedPulling="2025-12-03 08:18:25.731702044 +0000 UTC m=+6443.075285552" lastFinishedPulling="2025-12-03 08:18:39.367118225 +0000 UTC m=+6456.710701733" observedRunningTime="2025-12-03 08:18:40.605407474 +0000 UTC m=+6457.948990992" watchObservedRunningTime="2025-12-03 08:18:40.614806267 +0000 UTC m=+6457.958389765" Dec 03 08:18:51 crc kubenswrapper[4831]: I1203 08:18:51.078502 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:18:51 crc kubenswrapper[4831]: E1203 08:18:51.080011 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:18:52 crc kubenswrapper[4831]: I1203 08:18:52.738375 4831 generic.go:334] "Generic (PLEG): container finished" podID="b73afe21-6869-4926-b295-8ec52f0e41be" containerID="796f16f2d2232b30630323437295e00b01827f22294ea418a13a9285c8ef9354" exitCode=0 Dec 03 
08:18:52 crc kubenswrapper[4831]: I1203 08:18:52.738470 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" event={"ID":"b73afe21-6869-4926-b295-8ec52f0e41be","Type":"ContainerDied","Data":"796f16f2d2232b30630323437295e00b01827f22294ea418a13a9285c8ef9354"} Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.279391 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.389744 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ceph\") pod \"b73afe21-6869-4926-b295-8ec52f0e41be\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.389799 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-pre-adoption-validation-combined-ca-bundle\") pod \"b73afe21-6869-4926-b295-8ec52f0e41be\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.389960 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztg86\" (UniqueName: \"kubernetes.io/projected/b73afe21-6869-4926-b295-8ec52f0e41be-kube-api-access-ztg86\") pod \"b73afe21-6869-4926-b295-8ec52f0e41be\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.389988 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ssh-key\") pod \"b73afe21-6869-4926-b295-8ec52f0e41be\" (UID: 
\"b73afe21-6869-4926-b295-8ec52f0e41be\") " Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.390171 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-inventory\") pod \"b73afe21-6869-4926-b295-8ec52f0e41be\" (UID: \"b73afe21-6869-4926-b295-8ec52f0e41be\") " Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.397393 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-pre-adoption-validation-combined-ca-bundle" (OuterVolumeSpecName: "pre-adoption-validation-combined-ca-bundle") pod "b73afe21-6869-4926-b295-8ec52f0e41be" (UID: "b73afe21-6869-4926-b295-8ec52f0e41be"). InnerVolumeSpecName "pre-adoption-validation-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.397715 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ceph" (OuterVolumeSpecName: "ceph") pod "b73afe21-6869-4926-b295-8ec52f0e41be" (UID: "b73afe21-6869-4926-b295-8ec52f0e41be"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.402562 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b73afe21-6869-4926-b295-8ec52f0e41be-kube-api-access-ztg86" (OuterVolumeSpecName: "kube-api-access-ztg86") pod "b73afe21-6869-4926-b295-8ec52f0e41be" (UID: "b73afe21-6869-4926-b295-8ec52f0e41be"). InnerVolumeSpecName "kube-api-access-ztg86". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.430675 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-inventory" (OuterVolumeSpecName: "inventory") pod "b73afe21-6869-4926-b295-8ec52f0e41be" (UID: "b73afe21-6869-4926-b295-8ec52f0e41be"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.439555 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b73afe21-6869-4926-b295-8ec52f0e41be" (UID: "b73afe21-6869-4926-b295-8ec52f0e41be"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.492768 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.492797 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.492810 4831 reconciler_common.go:293] "Volume detached for volume \"pre-adoption-validation-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-pre-adoption-validation-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.492822 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztg86\" (UniqueName: \"kubernetes.io/projected/b73afe21-6869-4926-b295-8ec52f0e41be-kube-api-access-ztg86\") on node \"crc\" DevicePath \"\"" Dec 
03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.492832 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b73afe21-6869-4926-b295-8ec52f0e41be-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.769604 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" event={"ID":"b73afe21-6869-4926-b295-8ec52f0e41be","Type":"ContainerDied","Data":"c8993c21c8c75b0c8610eb373a0dfbf6393f1d772ed3bd76c27db62eac75a1ad"} Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.769670 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c8993c21c8c75b0c8610eb373a0dfbf6393f1d772ed3bd76c27db62eac75a1ad" Dec 03 08:18:54 crc kubenswrapper[4831]: I1203 08:18:54.769789 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/pre-adoption-validation-openstack-pre-adoption-openstack-c746qw" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.266592 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t"] Dec 03 08:18:57 crc kubenswrapper[4831]: E1203 08:18:57.267525 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b73afe21-6869-4926-b295-8ec52f0e41be" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.267543 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b73afe21-6869-4926-b295-8ec52f0e41be" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.267745 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b73afe21-6869-4926-b295-8ec52f0e41be" containerName="pre-adoption-validation-openstack-pre-adoption-openstack-cell1" Dec 03 08:18:57 crc 
kubenswrapper[4831]: I1203 08:18:57.268459 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.272352 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.272620 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.272979 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.273996 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.290053 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t"] Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.374930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.375101 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpq7w\" (UniqueName: \"kubernetes.io/projected/307848db-fd00-472b-8653-c35696f43e6d-kube-api-access-fpq7w\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 
08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.375392 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.375492 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.375532 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.477716 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.477794 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.477888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.477959 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpq7w\" (UniqueName: \"kubernetes.io/projected/307848db-fd00-472b-8653-c35696f43e6d-kube-api-access-fpq7w\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.478080 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.484038 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-inventory\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " 
pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.484264 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-tripleo-cleanup-combined-ca-bundle\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.487183 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ceph\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.499887 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ssh-key\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.514152 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpq7w\" (UniqueName: \"kubernetes.io/projected/307848db-fd00-472b-8653-c35696f43e6d-kube-api-access-fpq7w\") pod \"tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:57 crc kubenswrapper[4831]: I1203 08:18:57.617408 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:18:58 crc kubenswrapper[4831]: I1203 08:18:58.002967 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t"] Dec 03 08:18:58 crc kubenswrapper[4831]: I1203 08:18:58.013844 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:18:58 crc kubenswrapper[4831]: I1203 08:18:58.822973 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" event={"ID":"307848db-fd00-472b-8653-c35696f43e6d","Type":"ContainerStarted","Data":"bcbdb721cc50dfca87fdf81b8ae2580c32e9414bbb862d2dde2894d486972c0e"} Dec 03 08:18:58 crc kubenswrapper[4831]: I1203 08:18:58.823259 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" event={"ID":"307848db-fd00-472b-8653-c35696f43e6d","Type":"ContainerStarted","Data":"d69e6727ea15ea945641e2fba35a1ec7a5ce4cf6c0a457ec122a5fc253afd1da"} Dec 03 08:19:04 crc kubenswrapper[4831]: I1203 08:19:04.013036 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:19:04 crc kubenswrapper[4831]: E1203 08:19:04.014132 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:19:17 crc kubenswrapper[4831]: I1203 08:19:17.013739 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:19:17 crc 
kubenswrapper[4831]: E1203 08:19:17.015826 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.050963 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" podStartSLOduration=31.825166301 podStartE2EDuration="32.050936363s" podCreationTimestamp="2025-12-03 08:18:57 +0000 UTC" firstStartedPulling="2025-12-03 08:18:58.013543204 +0000 UTC m=+6475.357126722" lastFinishedPulling="2025-12-03 08:18:58.239313236 +0000 UTC m=+6475.582896784" observedRunningTime="2025-12-03 08:18:58.845297661 +0000 UTC m=+6476.188881199" watchObservedRunningTime="2025-12-03 08:19:29.050936363 +0000 UTC m=+6506.394519891" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.054053 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-create-vtngw"] Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.069783 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-create-vtngw"] Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.412574 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-xvgl2"] Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.415356 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.438215 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvgl2"] Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.496110 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-utilities\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.496189 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-catalog-content\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.496284 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztc69\" (UniqueName: \"kubernetes.io/projected/f2060617-587d-4b87-a374-49b4fb2a2fd7-kube-api-access-ztc69\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.598304 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-utilities\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.598455 4831 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-catalog-content\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.598594 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztc69\" (UniqueName: \"kubernetes.io/projected/f2060617-587d-4b87-a374-49b4fb2a2fd7-kube-api-access-ztc69\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.598785 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-utilities\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.599037 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-catalog-content\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.633375 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztc69\" (UniqueName: \"kubernetes.io/projected/f2060617-587d-4b87-a374-49b4fb2a2fd7-kube-api-access-ztc69\") pod \"redhat-marketplace-xvgl2\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:29 crc kubenswrapper[4831]: I1203 08:19:29.784024 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:30 crc kubenswrapper[4831]: I1203 08:19:30.014032 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:19:30 crc kubenswrapper[4831]: E1203 08:19:30.014569 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:19:30 crc kubenswrapper[4831]: I1203 08:19:30.264130 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvgl2"] Dec 03 08:19:31 crc kubenswrapper[4831]: I1203 08:19:31.026074 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c6118f2-abda-41fc-9160-cf14d0742581" path="/var/lib/kubelet/pods/2c6118f2-abda-41fc-9160-cf14d0742581/volumes" Dec 03 08:19:31 crc kubenswrapper[4831]: I1203 08:19:31.033821 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-9d49-account-create-update-vtgjt"] Dec 03 08:19:31 crc kubenswrapper[4831]: I1203 08:19:31.042967 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-9d49-account-create-update-vtgjt"] Dec 03 08:19:31 crc kubenswrapper[4831]: I1203 08:19:31.239792 4831 generic.go:334] "Generic (PLEG): container finished" podID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerID="420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7" exitCode=0 Dec 03 08:19:31 crc kubenswrapper[4831]: I1203 08:19:31.239859 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvgl2" 
event={"ID":"f2060617-587d-4b87-a374-49b4fb2a2fd7","Type":"ContainerDied","Data":"420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7"} Dec 03 08:19:31 crc kubenswrapper[4831]: I1203 08:19:31.239906 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvgl2" event={"ID":"f2060617-587d-4b87-a374-49b4fb2a2fd7","Type":"ContainerStarted","Data":"a534afc8968b70915d92eea5a360f092be4d052d0e09a93afa557ac0fa865b61"} Dec 03 08:19:33 crc kubenswrapper[4831]: I1203 08:19:33.074296 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="693d9075-b11e-4420-b6d2-53f50b6cbeaf" path="/var/lib/kubelet/pods/693d9075-b11e-4420-b6d2-53f50b6cbeaf/volumes" Dec 03 08:19:33 crc kubenswrapper[4831]: I1203 08:19:33.262251 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvgl2" event={"ID":"f2060617-587d-4b87-a374-49b4fb2a2fd7","Type":"ContainerStarted","Data":"84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276"} Dec 03 08:19:34 crc kubenswrapper[4831]: I1203 08:19:34.280992 4831 generic.go:334] "Generic (PLEG): container finished" podID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerID="84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276" exitCode=0 Dec 03 08:19:34 crc kubenswrapper[4831]: I1203 08:19:34.281052 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvgl2" event={"ID":"f2060617-587d-4b87-a374-49b4fb2a2fd7","Type":"ContainerDied","Data":"84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276"} Dec 03 08:19:36 crc kubenswrapper[4831]: I1203 08:19:36.307791 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvgl2" event={"ID":"f2060617-587d-4b87-a374-49b4fb2a2fd7","Type":"ContainerStarted","Data":"59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8"} Dec 03 08:19:36 crc kubenswrapper[4831]: I1203 
08:19:36.355401 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-xvgl2" podStartSLOduration=3.453063409 podStartE2EDuration="7.355368986s" podCreationTimestamp="2025-12-03 08:19:29 +0000 UTC" firstStartedPulling="2025-12-03 08:19:31.24336158 +0000 UTC m=+6508.586945108" lastFinishedPulling="2025-12-03 08:19:35.145667177 +0000 UTC m=+6512.489250685" observedRunningTime="2025-12-03 08:19:36.335079714 +0000 UTC m=+6513.678663262" watchObservedRunningTime="2025-12-03 08:19:36.355368986 +0000 UTC m=+6513.698952544" Dec 03 08:19:37 crc kubenswrapper[4831]: I1203 08:19:37.050636 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-persistence-db-create-6xrql"] Dec 03 08:19:37 crc kubenswrapper[4831]: I1203 08:19:37.061062 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-persistence-db-create-6xrql"] Dec 03 08:19:38 crc kubenswrapper[4831]: I1203 08:19:38.050589 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-b4f4-account-create-update-zldmg"] Dec 03 08:19:38 crc kubenswrapper[4831]: I1203 08:19:38.063967 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-b4f4-account-create-update-zldmg"] Dec 03 08:19:39 crc kubenswrapper[4831]: I1203 08:19:39.029229 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3" path="/var/lib/kubelet/pods/9cdbd5f3-bcc4-4a5c-87be-e2fa05c6b2a3/volumes" Dec 03 08:19:39 crc kubenswrapper[4831]: I1203 08:19:39.030490 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5c8c1d0-6e54-4b99-9f75-f4620e738213" path="/var/lib/kubelet/pods/f5c8c1d0-6e54-4b99-9f75-f4620e738213/volumes" Dec 03 08:19:39 crc kubenswrapper[4831]: I1203 08:19:39.785132 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:39 crc 
kubenswrapper[4831]: I1203 08:19:39.785373 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:39 crc kubenswrapper[4831]: I1203 08:19:39.924198 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:40 crc kubenswrapper[4831]: I1203 08:19:40.430710 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:40 crc kubenswrapper[4831]: I1203 08:19:40.486340 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvgl2"] Dec 03 08:19:42 crc kubenswrapper[4831]: I1203 08:19:42.015342 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:19:42 crc kubenswrapper[4831]: E1203 08:19:42.016120 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:19:42 crc kubenswrapper[4831]: I1203 08:19:42.544432 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-xvgl2" podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerName="registry-server" containerID="cri-o://59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8" gracePeriod=2 Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.052579 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.215171 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-catalog-content\") pod \"f2060617-587d-4b87-a374-49b4fb2a2fd7\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.215541 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-utilities\") pod \"f2060617-587d-4b87-a374-49b4fb2a2fd7\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.215739 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztc69\" (UniqueName: \"kubernetes.io/projected/f2060617-587d-4b87-a374-49b4fb2a2fd7-kube-api-access-ztc69\") pod \"f2060617-587d-4b87-a374-49b4fb2a2fd7\" (UID: \"f2060617-587d-4b87-a374-49b4fb2a2fd7\") " Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.217124 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-utilities" (OuterVolumeSpecName: "utilities") pod "f2060617-587d-4b87-a374-49b4fb2a2fd7" (UID: "f2060617-587d-4b87-a374-49b4fb2a2fd7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.219572 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.226152 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2060617-587d-4b87-a374-49b4fb2a2fd7-kube-api-access-ztc69" (OuterVolumeSpecName: "kube-api-access-ztc69") pod "f2060617-587d-4b87-a374-49b4fb2a2fd7" (UID: "f2060617-587d-4b87-a374-49b4fb2a2fd7"). InnerVolumeSpecName "kube-api-access-ztc69". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.247031 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f2060617-587d-4b87-a374-49b4fb2a2fd7" (UID: "f2060617-587d-4b87-a374-49b4fb2a2fd7"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.322193 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztc69\" (UniqueName: \"kubernetes.io/projected/f2060617-587d-4b87-a374-49b4fb2a2fd7-kube-api-access-ztc69\") on node \"crc\" DevicePath \"\"" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.322230 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f2060617-587d-4b87-a374-49b4fb2a2fd7-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.556448 4831 generic.go:334] "Generic (PLEG): container finished" podID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerID="59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8" exitCode=0 Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.556507 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvgl2" event={"ID":"f2060617-587d-4b87-a374-49b4fb2a2fd7","Type":"ContainerDied","Data":"59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8"} Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.556538 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-xvgl2" event={"ID":"f2060617-587d-4b87-a374-49b4fb2a2fd7","Type":"ContainerDied","Data":"a534afc8968b70915d92eea5a360f092be4d052d0e09a93afa557ac0fa865b61"} Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.556557 4831 scope.go:117] "RemoveContainer" containerID="59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.556741 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-xvgl2" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.592629 4831 scope.go:117] "RemoveContainer" containerID="84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.602875 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvgl2"] Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.614103 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-xvgl2"] Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.636023 4831 scope.go:117] "RemoveContainer" containerID="420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.681335 4831 scope.go:117] "RemoveContainer" containerID="59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8" Dec 03 08:19:43 crc kubenswrapper[4831]: E1203 08:19:43.681699 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8\": container with ID starting with 59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8 not found: ID does not exist" containerID="59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.681730 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8"} err="failed to get container status \"59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8\": rpc error: code = NotFound desc = could not find container \"59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8\": container with ID starting with 59cee462251d1ae9d7df1f7cfc91e609b18b845b0381cfed94842e397d0660a8 not found: 
ID does not exist" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.681750 4831 scope.go:117] "RemoveContainer" containerID="84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276" Dec 03 08:19:43 crc kubenswrapper[4831]: E1203 08:19:43.682128 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276\": container with ID starting with 84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276 not found: ID does not exist" containerID="84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.682150 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276"} err="failed to get container status \"84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276\": rpc error: code = NotFound desc = could not find container \"84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276\": container with ID starting with 84f688b0286584e4f66a51ed2977990897ac98e2bb0be0779c5f491b22fee276 not found: ID does not exist" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.682164 4831 scope.go:117] "RemoveContainer" containerID="420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7" Dec 03 08:19:43 crc kubenswrapper[4831]: E1203 08:19:43.682571 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7\": container with ID starting with 420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7 not found: ID does not exist" containerID="420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7" Dec 03 08:19:43 crc kubenswrapper[4831]: I1203 08:19:43.682617 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7"} err="failed to get container status \"420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7\": rpc error: code = NotFound desc = could not find container \"420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7\": container with ID starting with 420ca0e5c93f80aaae0ce41277ab1426d05eb0ed2c2a8c7cdce9fd90866567d7 not found: ID does not exist" Dec 03 08:19:45 crc kubenswrapper[4831]: I1203 08:19:45.037158 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" path="/var/lib/kubelet/pods/f2060617-587d-4b87-a374-49b4fb2a2fd7/volumes" Dec 03 08:19:54 crc kubenswrapper[4831]: I1203 08:19:54.013686 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:19:54 crc kubenswrapper[4831]: E1203 08:19:54.014625 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:20:06 crc kubenswrapper[4831]: I1203 08:20:06.014168 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:20:06 crc kubenswrapper[4831]: E1203 08:20:06.017531 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:20:17 crc kubenswrapper[4831]: I1203 08:20:17.013975 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:20:17 crc kubenswrapper[4831]: E1203 08:20:17.015056 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:20:19 crc kubenswrapper[4831]: I1203 08:20:19.047106 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/octavia-db-sync-mk5pz"] Dec 03 08:20:19 crc kubenswrapper[4831]: I1203 08:20:19.068767 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/octavia-db-sync-mk5pz"] Dec 03 08:20:21 crc kubenswrapper[4831]: I1203 08:20:21.029205 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c" path="/var/lib/kubelet/pods/7cd6ee4f-27fd-4e4a-aea7-5862219fdf3c/volumes" Dec 03 08:20:25 crc kubenswrapper[4831]: I1203 08:20:25.145416 4831 scope.go:117] "RemoveContainer" containerID="dd23da52a8cb037b3156127215b643e85357e4d660c950e512c3a1a1c6c75628" Dec 03 08:20:25 crc kubenswrapper[4831]: I1203 08:20:25.181889 4831 scope.go:117] "RemoveContainer" containerID="9c743015b310deec86cc3b40b4ece090a4fba60b8366f77ecc886cbd5e3e3e3e" Dec 03 08:20:25 crc kubenswrapper[4831]: I1203 08:20:25.236459 4831 scope.go:117] "RemoveContainer" containerID="2f0f9521b47b441378ea565ea86104d2d347b013bcb1156239e119eb5d036712" Dec 03 08:20:25 crc kubenswrapper[4831]: I1203 08:20:25.295512 4831 scope.go:117] "RemoveContainer" 
containerID="19ba57eb0cab60fd76b8207156a52ab068b6b3cd8cb4107a9d3a3285e3c219cc" Dec 03 08:20:25 crc kubenswrapper[4831]: I1203 08:20:25.341352 4831 scope.go:117] "RemoveContainer" containerID="a43e7f1eadecaf8f2edb5a97b55f867a2382ed52337791bb6e47c00b2f9a4094" Dec 03 08:20:25 crc kubenswrapper[4831]: I1203 08:20:25.398975 4831 scope.go:117] "RemoveContainer" containerID="e4dd7560ac1a7be74e9ad6ee92475b899ef996e4a5122626bf36aece0fa1fe41" Dec 03 08:20:29 crc kubenswrapper[4831]: I1203 08:20:29.013347 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:20:29 crc kubenswrapper[4831]: E1203 08:20:29.014584 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:20:40 crc kubenswrapper[4831]: I1203 08:20:40.013311 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:20:40 crc kubenswrapper[4831]: E1203 08:20:40.014426 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:20:51 crc kubenswrapper[4831]: I1203 08:20:51.015026 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:20:51 crc 
kubenswrapper[4831]: E1203 08:20:51.015818 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:21:02 crc kubenswrapper[4831]: I1203 08:21:02.013986 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:21:02 crc kubenswrapper[4831]: E1203 08:21:02.015264 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:21:14 crc kubenswrapper[4831]: I1203 08:21:14.014048 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:21:14 crc kubenswrapper[4831]: E1203 08:21:14.015627 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:21:29 crc kubenswrapper[4831]: I1203 08:21:29.013612 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 
03 08:21:29 crc kubenswrapper[4831]: E1203 08:21:29.014209 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.584701 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vh487"] Dec 03 08:21:35 crc kubenswrapper[4831]: E1203 08:21:35.585666 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerName="extract-utilities" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.585680 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerName="extract-utilities" Dec 03 08:21:35 crc kubenswrapper[4831]: E1203 08:21:35.585698 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerName="extract-content" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.585704 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerName="extract-content" Dec 03 08:21:35 crc kubenswrapper[4831]: E1203 08:21:35.585724 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerName="registry-server" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.585730 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerName="registry-server" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.585942 4831 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f2060617-587d-4b87-a374-49b4fb2a2fd7" containerName="registry-server" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.587457 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.595946 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vh487"] Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.688547 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqwjh\" (UniqueName: \"kubernetes.io/projected/332abd9c-a42c-444b-96d8-8721c6999921-kube-api-access-nqwjh\") pod \"community-operators-vh487\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.688748 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-catalog-content\") pod \"community-operators-vh487\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.688832 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-utilities\") pod \"community-operators-vh487\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.790738 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-utilities\") pod \"community-operators-vh487\" (UID: 
\"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.790892 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqwjh\" (UniqueName: \"kubernetes.io/projected/332abd9c-a42c-444b-96d8-8721c6999921-kube-api-access-nqwjh\") pod \"community-operators-vh487\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.790981 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-catalog-content\") pod \"community-operators-vh487\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.791497 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-catalog-content\") pod \"community-operators-vh487\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.792109 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-utilities\") pod \"community-operators-vh487\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:35 crc kubenswrapper[4831]: I1203 08:21:35.819788 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqwjh\" (UniqueName: \"kubernetes.io/projected/332abd9c-a42c-444b-96d8-8721c6999921-kube-api-access-nqwjh\") pod \"community-operators-vh487\" (UID: 
\"332abd9c-a42c-444b-96d8-8721c6999921\") " pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:36 crc kubenswrapper[4831]: I1203 08:21:36.004137 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:36 crc kubenswrapper[4831]: I1203 08:21:36.497111 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vh487"] Dec 03 08:21:36 crc kubenswrapper[4831]: W1203 08:21:36.504925 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod332abd9c_a42c_444b_96d8_8721c6999921.slice/crio-8bde1c0ced475139d2baafff67370dd2dad7b0f62ffc887af3cd38ee61c9a3c5 WatchSource:0}: Error finding container 8bde1c0ced475139d2baafff67370dd2dad7b0f62ffc887af3cd38ee61c9a3c5: Status 404 returned error can't find the container with id 8bde1c0ced475139d2baafff67370dd2dad7b0f62ffc887af3cd38ee61c9a3c5 Dec 03 08:21:36 crc kubenswrapper[4831]: I1203 08:21:36.723364 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh487" event={"ID":"332abd9c-a42c-444b-96d8-8721c6999921","Type":"ContainerStarted","Data":"369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595"} Dec 03 08:21:36 crc kubenswrapper[4831]: I1203 08:21:36.723405 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh487" event={"ID":"332abd9c-a42c-444b-96d8-8721c6999921","Type":"ContainerStarted","Data":"8bde1c0ced475139d2baafff67370dd2dad7b0f62ffc887af3cd38ee61c9a3c5"} Dec 03 08:21:37 crc kubenswrapper[4831]: I1203 08:21:37.743825 4831 generic.go:334] "Generic (PLEG): container finished" podID="332abd9c-a42c-444b-96d8-8721c6999921" containerID="369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595" exitCode=0 Dec 03 08:21:37 crc kubenswrapper[4831]: I1203 08:21:37.744102 4831 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh487" event={"ID":"332abd9c-a42c-444b-96d8-8721c6999921","Type":"ContainerDied","Data":"369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595"} Dec 03 08:21:37 crc kubenswrapper[4831]: I1203 08:21:37.969400 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xw29f"] Dec 03 08:21:37 crc kubenswrapper[4831]: I1203 08:21:37.972040 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:37 crc kubenswrapper[4831]: I1203 08:21:37.981417 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xw29f"] Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.039925 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-utilities\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.039967 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phq9q\" (UniqueName: \"kubernetes.io/projected/eff03abb-783a-42fa-8771-a455f23f972a-kube-api-access-phq9q\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.040408 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-catalog-content\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 
08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.143671 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-utilities\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.144112 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-phq9q\" (UniqueName: \"kubernetes.io/projected/eff03abb-783a-42fa-8771-a455f23f972a-kube-api-access-phq9q\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.144337 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-catalog-content\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.144620 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-utilities\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.144966 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-catalog-content\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.170214 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-phq9q\" (UniqueName: \"kubernetes.io/projected/eff03abb-783a-42fa-8771-a455f23f972a-kube-api-access-phq9q\") pod \"redhat-operators-xw29f\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.289568 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:38 crc kubenswrapper[4831]: I1203 08:21:38.864373 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xw29f"] Dec 03 08:21:38 crc kubenswrapper[4831]: W1203 08:21:38.868547 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeff03abb_783a_42fa_8771_a455f23f972a.slice/crio-68725c994a2ea4610cb3f07e923a0485316c0ead652b7c02e596736575ccfddf WatchSource:0}: Error finding container 68725c994a2ea4610cb3f07e923a0485316c0ead652b7c02e596736575ccfddf: Status 404 returned error can't find the container with id 68725c994a2ea4610cb3f07e923a0485316c0ead652b7c02e596736575ccfddf Dec 03 08:21:39 crc kubenswrapper[4831]: I1203 08:21:39.768420 4831 generic.go:334] "Generic (PLEG): container finished" podID="eff03abb-783a-42fa-8771-a455f23f972a" containerID="59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d" exitCode=0 Dec 03 08:21:39 crc kubenswrapper[4831]: I1203 08:21:39.768483 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw29f" event={"ID":"eff03abb-783a-42fa-8771-a455f23f972a","Type":"ContainerDied","Data":"59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d"} Dec 03 08:21:39 crc kubenswrapper[4831]: I1203 08:21:39.769234 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw29f" 
event={"ID":"eff03abb-783a-42fa-8771-a455f23f972a","Type":"ContainerStarted","Data":"68725c994a2ea4610cb3f07e923a0485316c0ead652b7c02e596736575ccfddf"} Dec 03 08:21:40 crc kubenswrapper[4831]: I1203 08:21:40.013085 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:21:40 crc kubenswrapper[4831]: E1203 08:21:40.013396 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:21:40 crc kubenswrapper[4831]: I1203 08:21:40.781977 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh487" event={"ID":"332abd9c-a42c-444b-96d8-8721c6999921","Type":"ContainerStarted","Data":"63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188"} Dec 03 08:21:42 crc kubenswrapper[4831]: I1203 08:21:42.814771 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw29f" event={"ID":"eff03abb-783a-42fa-8771-a455f23f972a","Type":"ContainerStarted","Data":"95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54"} Dec 03 08:21:42 crc kubenswrapper[4831]: I1203 08:21:42.817616 4831 generic.go:334] "Generic (PLEG): container finished" podID="332abd9c-a42c-444b-96d8-8721c6999921" containerID="63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188" exitCode=0 Dec 03 08:21:42 crc kubenswrapper[4831]: I1203 08:21:42.817657 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh487" 
event={"ID":"332abd9c-a42c-444b-96d8-8721c6999921","Type":"ContainerDied","Data":"63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188"} Dec 03 08:21:43 crc kubenswrapper[4831]: I1203 08:21:43.896402 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh487" event={"ID":"332abd9c-a42c-444b-96d8-8721c6999921","Type":"ContainerStarted","Data":"fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb"} Dec 03 08:21:43 crc kubenswrapper[4831]: I1203 08:21:43.934544 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vh487" podStartSLOduration=3.46378985 podStartE2EDuration="8.934514028s" podCreationTimestamp="2025-12-03 08:21:35 +0000 UTC" firstStartedPulling="2025-12-03 08:21:37.746180057 +0000 UTC m=+6635.089763565" lastFinishedPulling="2025-12-03 08:21:43.216904235 +0000 UTC m=+6640.560487743" observedRunningTime="2025-12-03 08:21:43.931764182 +0000 UTC m=+6641.275347690" watchObservedRunningTime="2025-12-03 08:21:43.934514028 +0000 UTC m=+6641.278097536" Dec 03 08:21:46 crc kubenswrapper[4831]: I1203 08:21:46.004784 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:46 crc kubenswrapper[4831]: I1203 08:21:46.005206 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:46 crc kubenswrapper[4831]: I1203 08:21:46.926653 4831 generic.go:334] "Generic (PLEG): container finished" podID="eff03abb-783a-42fa-8771-a455f23f972a" containerID="95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54" exitCode=0 Dec 03 08:21:46 crc kubenswrapper[4831]: I1203 08:21:46.926760 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw29f" 
event={"ID":"eff03abb-783a-42fa-8771-a455f23f972a","Type":"ContainerDied","Data":"95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54"} Dec 03 08:21:47 crc kubenswrapper[4831]: I1203 08:21:47.062425 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vh487" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="registry-server" probeResult="failure" output=< Dec 03 08:21:47 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 08:21:47 crc kubenswrapper[4831]: > Dec 03 08:21:48 crc kubenswrapper[4831]: I1203 08:21:48.951378 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw29f" event={"ID":"eff03abb-783a-42fa-8771-a455f23f972a","Type":"ContainerStarted","Data":"81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f"} Dec 03 08:21:48 crc kubenswrapper[4831]: I1203 08:21:48.974058 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xw29f" podStartSLOduration=3.931534622 podStartE2EDuration="11.974028652s" podCreationTimestamp="2025-12-03 08:21:37 +0000 UTC" firstStartedPulling="2025-12-03 08:21:39.770771438 +0000 UTC m=+6637.114354946" lastFinishedPulling="2025-12-03 08:21:47.813265458 +0000 UTC m=+6645.156848976" observedRunningTime="2025-12-03 08:21:48.973179256 +0000 UTC m=+6646.316762774" watchObservedRunningTime="2025-12-03 08:21:48.974028652 +0000 UTC m=+6646.317612160" Dec 03 08:21:52 crc kubenswrapper[4831]: I1203 08:21:52.012626 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:21:52 crc kubenswrapper[4831]: E1203 08:21:52.013819 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:21:56 crc kubenswrapper[4831]: I1203 08:21:56.053408 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:56 crc kubenswrapper[4831]: I1203 08:21:56.109286 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:56 crc kubenswrapper[4831]: I1203 08:21:56.591409 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vh487"] Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.044424 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vh487" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="registry-server" containerID="cri-o://fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb" gracePeriod=2 Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.289770 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.291048 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.347779 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.775986 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.868268 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-utilities\") pod \"332abd9c-a42c-444b-96d8-8721c6999921\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.868491 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqwjh\" (UniqueName: \"kubernetes.io/projected/332abd9c-a42c-444b-96d8-8721c6999921-kube-api-access-nqwjh\") pod \"332abd9c-a42c-444b-96d8-8721c6999921\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.868540 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-catalog-content\") pod \"332abd9c-a42c-444b-96d8-8721c6999921\" (UID: \"332abd9c-a42c-444b-96d8-8721c6999921\") " Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.869064 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-utilities" (OuterVolumeSpecName: "utilities") pod "332abd9c-a42c-444b-96d8-8721c6999921" (UID: "332abd9c-a42c-444b-96d8-8721c6999921"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.874136 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/332abd9c-a42c-444b-96d8-8721c6999921-kube-api-access-nqwjh" (OuterVolumeSpecName: "kube-api-access-nqwjh") pod "332abd9c-a42c-444b-96d8-8721c6999921" (UID: "332abd9c-a42c-444b-96d8-8721c6999921"). InnerVolumeSpecName "kube-api-access-nqwjh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.918091 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "332abd9c-a42c-444b-96d8-8721c6999921" (UID: "332abd9c-a42c-444b-96d8-8721c6999921"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.970885 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.970917 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqwjh\" (UniqueName: \"kubernetes.io/projected/332abd9c-a42c-444b-96d8-8721c6999921-kube-api-access-nqwjh\") on node \"crc\" DevicePath \"\"" Dec 03 08:21:58 crc kubenswrapper[4831]: I1203 08:21:58.970927 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/332abd9c-a42c-444b-96d8-8721c6999921-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.061642 4831 generic.go:334] "Generic (PLEG): container finished" podID="332abd9c-a42c-444b-96d8-8721c6999921" containerID="fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb" exitCode=0 Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.063522 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vh487" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.063576 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh487" event={"ID":"332abd9c-a42c-444b-96d8-8721c6999921","Type":"ContainerDied","Data":"fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb"} Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.063636 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vh487" event={"ID":"332abd9c-a42c-444b-96d8-8721c6999921","Type":"ContainerDied","Data":"8bde1c0ced475139d2baafff67370dd2dad7b0f62ffc887af3cd38ee61c9a3c5"} Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.063654 4831 scope.go:117] "RemoveContainer" containerID="fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.099438 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vh487"] Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.107049 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vh487"] Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.113860 4831 scope.go:117] "RemoveContainer" containerID="63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.123186 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.146405 4831 scope.go:117] "RemoveContainer" containerID="369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.196297 4831 scope.go:117] "RemoveContainer" containerID="fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb" Dec 03 08:21:59 crc 
kubenswrapper[4831]: E1203 08:21:59.197045 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb\": container with ID starting with fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb not found: ID does not exist" containerID="fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.197085 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb"} err="failed to get container status \"fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb\": rpc error: code = NotFound desc = could not find container \"fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb\": container with ID starting with fe65bc786564c7628c12745e59d434864ec58c1ab0b3e84cfbd41b01da71e3eb not found: ID does not exist" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.197114 4831 scope.go:117] "RemoveContainer" containerID="63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188" Dec 03 08:21:59 crc kubenswrapper[4831]: E1203 08:21:59.198008 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188\": container with ID starting with 63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188 not found: ID does not exist" containerID="63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.198067 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188"} err="failed to get container status 
\"63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188\": rpc error: code = NotFound desc = could not find container \"63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188\": container with ID starting with 63dab45d84a87009ec5298d5a60a18c5cfb3e76c60368180fbe6fe469c61e188 not found: ID does not exist" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.198103 4831 scope.go:117] "RemoveContainer" containerID="369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595" Dec 03 08:21:59 crc kubenswrapper[4831]: E1203 08:21:59.198691 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595\": container with ID starting with 369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595 not found: ID does not exist" containerID="369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595" Dec 03 08:21:59 crc kubenswrapper[4831]: I1203 08:21:59.198727 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595"} err="failed to get container status \"369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595\": rpc error: code = NotFound desc = could not find container \"369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595\": container with ID starting with 369826b42e01d779094c52eacabe06240fb954e97bfcac3873c1b9dc6e236595 not found: ID does not exist" Dec 03 08:22:01 crc kubenswrapper[4831]: I1203 08:22:01.026985 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="332abd9c-a42c-444b-96d8-8721c6999921" path="/var/lib/kubelet/pods/332abd9c-a42c-444b-96d8-8721c6999921/volumes" Dec 03 08:22:01 crc kubenswrapper[4831]: I1203 08:22:01.383987 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xw29f"] Dec 03 08:22:02 
crc kubenswrapper[4831]: I1203 08:22:02.097043 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xw29f" podUID="eff03abb-783a-42fa-8771-a455f23f972a" containerName="registry-server" containerID="cri-o://81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f" gracePeriod=2 Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.692553 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.749281 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-catalog-content\") pod \"eff03abb-783a-42fa-8771-a455f23f972a\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.749484 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phq9q\" (UniqueName: \"kubernetes.io/projected/eff03abb-783a-42fa-8771-a455f23f972a-kube-api-access-phq9q\") pod \"eff03abb-783a-42fa-8771-a455f23f972a\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.749529 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-utilities\") pod \"eff03abb-783a-42fa-8771-a455f23f972a\" (UID: \"eff03abb-783a-42fa-8771-a455f23f972a\") " Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.751337 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-utilities" (OuterVolumeSpecName: "utilities") pod "eff03abb-783a-42fa-8771-a455f23f972a" (UID: "eff03abb-783a-42fa-8771-a455f23f972a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.765755 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff03abb-783a-42fa-8771-a455f23f972a-kube-api-access-phq9q" (OuterVolumeSpecName: "kube-api-access-phq9q") pod "eff03abb-783a-42fa-8771-a455f23f972a" (UID: "eff03abb-783a-42fa-8771-a455f23f972a"). InnerVolumeSpecName "kube-api-access-phq9q". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.853637 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-phq9q\" (UniqueName: \"kubernetes.io/projected/eff03abb-783a-42fa-8771-a455f23f972a-kube-api-access-phq9q\") on node \"crc\" DevicePath \"\"" Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.853727 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.880146 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eff03abb-783a-42fa-8771-a455f23f972a" (UID: "eff03abb-783a-42fa-8771-a455f23f972a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:22:02 crc kubenswrapper[4831]: I1203 08:22:02.956534 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff03abb-783a-42fa-8771-a455f23f972a-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.111002 4831 generic.go:334] "Generic (PLEG): container finished" podID="eff03abb-783a-42fa-8771-a455f23f972a" containerID="81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f" exitCode=0 Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.111055 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw29f" event={"ID":"eff03abb-783a-42fa-8771-a455f23f972a","Type":"ContainerDied","Data":"81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f"} Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.111090 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xw29f" event={"ID":"eff03abb-783a-42fa-8771-a455f23f972a","Type":"ContainerDied","Data":"68725c994a2ea4610cb3f07e923a0485316c0ead652b7c02e596736575ccfddf"} Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.111112 4831 scope.go:117] "RemoveContainer" containerID="81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.111223 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-xw29f" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.142938 4831 scope.go:117] "RemoveContainer" containerID="95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.143055 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xw29f"] Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.152880 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xw29f"] Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.193056 4831 scope.go:117] "RemoveContainer" containerID="59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.240815 4831 scope.go:117] "RemoveContainer" containerID="81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f" Dec 03 08:22:03 crc kubenswrapper[4831]: E1203 08:22:03.241296 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f\": container with ID starting with 81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f not found: ID does not exist" containerID="81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.241368 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f"} err="failed to get container status \"81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f\": rpc error: code = NotFound desc = could not find container \"81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f\": container with ID starting with 81f163c828a14121db92801baae655775028560b052f669454c1b400f9202c8f not found: ID does 
not exist" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.241397 4831 scope.go:117] "RemoveContainer" containerID="95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54" Dec 03 08:22:03 crc kubenswrapper[4831]: E1203 08:22:03.241738 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54\": container with ID starting with 95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54 not found: ID does not exist" containerID="95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.241775 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54"} err="failed to get container status \"95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54\": rpc error: code = NotFound desc = could not find container \"95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54\": container with ID starting with 95da6e6670892e3262d6560394ac5ef152eaaa8857a43cb3205a05d5e40eff54 not found: ID does not exist" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.241811 4831 scope.go:117] "RemoveContainer" containerID="59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d" Dec 03 08:22:03 crc kubenswrapper[4831]: E1203 08:22:03.242044 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d\": container with ID starting with 59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d not found: ID does not exist" containerID="59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d" Dec 03 08:22:03 crc kubenswrapper[4831]: I1203 08:22:03.242073 4831 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d"} err="failed to get container status \"59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d\": rpc error: code = NotFound desc = could not find container \"59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d\": container with ID starting with 59daee1ee8917e6ecf3d565959dda29cf9c14e537d155229bd5aa3b199a1441d not found: ID does not exist" Dec 03 08:22:05 crc kubenswrapper[4831]: I1203 08:22:05.034419 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff03abb-783a-42fa-8771-a455f23f972a" path="/var/lib/kubelet/pods/eff03abb-783a-42fa-8771-a455f23f972a/volumes" Dec 03 08:22:06 crc kubenswrapper[4831]: I1203 08:22:06.013939 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:22:06 crc kubenswrapper[4831]: E1203 08:22:06.014276 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:22:17 crc kubenswrapper[4831]: I1203 08:22:17.013411 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:22:17 crc kubenswrapper[4831]: E1203 08:22:17.014849 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:22:30 crc kubenswrapper[4831]: I1203 08:22:30.013007 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:22:30 crc kubenswrapper[4831]: E1203 08:22:30.013751 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:22:41 crc kubenswrapper[4831]: I1203 08:22:41.015523 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:22:41 crc kubenswrapper[4831]: E1203 08:22:41.017580 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:22:55 crc kubenswrapper[4831]: I1203 08:22:55.013346 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:22:55 crc kubenswrapper[4831]: E1203 08:22:55.014068 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:23:09 crc kubenswrapper[4831]: I1203 08:23:09.013939 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:23:09 crc kubenswrapper[4831]: I1203 08:23:09.838845 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"759264b96e116c4acf060f11d65606e11ddea3512a056f26cfafa45fbfb9fda2"} Dec 03 08:23:29 crc kubenswrapper[4831]: I1203 08:23:29.081578 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-e94a-account-create-update-rk45r"] Dec 03 08:23:29 crc kubenswrapper[4831]: I1203 08:23:29.098444 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/heat-db-create-lvjzd"] Dec 03 08:23:29 crc kubenswrapper[4831]: I1203 08:23:29.108628 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-e94a-account-create-update-rk45r"] Dec 03 08:23:29 crc kubenswrapper[4831]: I1203 08:23:29.119235 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-create-lvjzd"] Dec 03 08:23:31 crc kubenswrapper[4831]: I1203 08:23:31.025146 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b9f139e-f14a-4bc6-81b4-a9635b47c78e" path="/var/lib/kubelet/pods/6b9f139e-f14a-4bc6-81b4-a9635b47c78e/volumes" Dec 03 08:23:31 crc kubenswrapper[4831]: I1203 08:23:31.026031 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1fdc7ed-1970-42d9-a759-9b9fa3566070" path="/var/lib/kubelet/pods/c1fdc7ed-1970-42d9-a759-9b9fa3566070/volumes" Dec 03 08:23:44 crc kubenswrapper[4831]: I1203 08:23:44.047344 4831 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/heat-db-sync-l5zkp"] Dec 03 08:23:44 crc kubenswrapper[4831]: I1203 08:23:44.063377 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/heat-db-sync-l5zkp"] Dec 03 08:23:45 crc kubenswrapper[4831]: I1203 08:23:45.027629 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb6265a8-c4d5-43b4-b025-35a01e45411f" path="/var/lib/kubelet/pods/fb6265a8-c4d5-43b4-b025-35a01e45411f/volumes" Dec 03 08:24:25 crc kubenswrapper[4831]: I1203 08:24:25.716145 4831 scope.go:117] "RemoveContainer" containerID="64d4d0a737b87028c1806f952df218053da6065c4b1690bfcc007fb406fcc685" Dec 03 08:24:25 crc kubenswrapper[4831]: I1203 08:24:25.748571 4831 scope.go:117] "RemoveContainer" containerID="b15a651c9f0c4e0bd6d04443d3e2b8688b5688b55c1a07a6678c42b0483741a1" Dec 03 08:24:25 crc kubenswrapper[4831]: I1203 08:24:25.795157 4831 scope.go:117] "RemoveContainer" containerID="66f7ba7aa5e988e305a00d2993bce58bbd7a812efb1462d67e0b03bbd30e2a10" Dec 03 08:25:27 crc kubenswrapper[4831]: I1203 08:25:27.597164 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:25:27 crc kubenswrapper[4831]: I1203 08:25:27.597845 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:25:54 crc kubenswrapper[4831]: I1203 08:25:54.044521 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-create-f7s6p"] Dec 03 08:25:54 crc kubenswrapper[4831]: I1203 08:25:54.056121 4831 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openstack/aodh-c002-account-create-update-ks2p9"] Dec 03 08:25:54 crc kubenswrapper[4831]: I1203 08:25:54.064992 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-create-f7s6p"] Dec 03 08:25:54 crc kubenswrapper[4831]: I1203 08:25:54.073075 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-c002-account-create-update-ks2p9"] Dec 03 08:25:55 crc kubenswrapper[4831]: I1203 08:25:55.035433 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aab1c516-0cc9-4ee3-b0ac-993e92c6fd10" path="/var/lib/kubelet/pods/aab1c516-0cc9-4ee3-b0ac-993e92c6fd10/volumes" Dec 03 08:25:55 crc kubenswrapper[4831]: I1203 08:25:55.036903 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8020737-2ead-4709-ad77-8433e4fb38cb" path="/var/lib/kubelet/pods/e8020737-2ead-4709-ad77-8433e4fb38cb/volumes" Dec 03 08:25:57 crc kubenswrapper[4831]: I1203 08:25:57.596306 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:25:57 crc kubenswrapper[4831]: I1203 08:25:57.596806 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:26:07 crc kubenswrapper[4831]: I1203 08:26:07.039919 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/aodh-db-sync-4c4nv"] Dec 03 08:26:07 crc kubenswrapper[4831]: I1203 08:26:07.054253 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/aodh-db-sync-4c4nv"] Dec 03 08:26:09 crc kubenswrapper[4831]: 
I1203 08:26:09.030125 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e766432e-74e3-4160-adf1-1d2406683662" path="/var/lib/kubelet/pods/e766432e-74e3-4160-adf1-1d2406683662/volumes" Dec 03 08:26:25 crc kubenswrapper[4831]: I1203 08:26:25.946738 4831 scope.go:117] "RemoveContainer" containerID="ede2f4831e82f16cdcd581ca83e3c1e0e006bfc3f238dbeb11528004f043a658" Dec 03 08:26:25 crc kubenswrapper[4831]: I1203 08:26:25.991893 4831 scope.go:117] "RemoveContainer" containerID="489140d2e14c9afa177a1880cdf4b96133f8335120e51c370bdbcca6f3915763" Dec 03 08:26:26 crc kubenswrapper[4831]: I1203 08:26:26.033399 4831 scope.go:117] "RemoveContainer" containerID="ddf3a5fc7ce171a4af600381e10d4bc90fd1126b1a5bddc8b0ad55abe51a8845" Dec 03 08:26:27 crc kubenswrapper[4831]: I1203 08:26:27.596381 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:26:27 crc kubenswrapper[4831]: I1203 08:26:27.596749 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:26:27 crc kubenswrapper[4831]: I1203 08:26:27.596805 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:26:27 crc kubenswrapper[4831]: I1203 08:26:27.597967 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"759264b96e116c4acf060f11d65606e11ddea3512a056f26cfafa45fbfb9fda2"} 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:26:27 crc kubenswrapper[4831]: I1203 08:26:27.598027 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://759264b96e116c4acf060f11d65606e11ddea3512a056f26cfafa45fbfb9fda2" gracePeriod=600 Dec 03 08:26:28 crc kubenswrapper[4831]: I1203 08:26:28.035403 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-lqtzw"] Dec 03 08:26:28 crc kubenswrapper[4831]: I1203 08:26:28.049081 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-b907-account-create-update-8kt2t"] Dec 03 08:26:28 crc kubenswrapper[4831]: I1203 08:26:28.061582 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-lqtzw"] Dec 03 08:26:28 crc kubenswrapper[4831]: I1203 08:26:28.069111 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-b907-account-create-update-8kt2t"] Dec 03 08:26:28 crc kubenswrapper[4831]: I1203 08:26:28.128593 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="759264b96e116c4acf060f11d65606e11ddea3512a056f26cfafa45fbfb9fda2" exitCode=0 Dec 03 08:26:28 crc kubenswrapper[4831]: I1203 08:26:28.128634 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"759264b96e116c4acf060f11d65606e11ddea3512a056f26cfafa45fbfb9fda2"} Dec 03 08:26:28 crc kubenswrapper[4831]: I1203 08:26:28.128661 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"} Dec 03 08:26:28 crc kubenswrapper[4831]: I1203 08:26:28.128676 4831 scope.go:117] "RemoveContainer" containerID="914a4460736924385a3004ccee997bcced8cb1b11119ff2f03c75b773771d911" Dec 03 08:26:29 crc kubenswrapper[4831]: I1203 08:26:29.029167 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80ab9d94-b820-49a4-a0a4-bf0c6311a691" path="/var/lib/kubelet/pods/80ab9d94-b820-49a4-a0a4-bf0c6311a691/volumes" Dec 03 08:26:29 crc kubenswrapper[4831]: I1203 08:26:29.030555 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cbb68c80-d879-4f3a-8ea3-beea7b78a02f" path="/var/lib/kubelet/pods/cbb68c80-d879-4f3a-8ea3-beea7b78a02f/volumes" Dec 03 08:26:40 crc kubenswrapper[4831]: I1203 08:26:40.062379 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-67jx5"] Dec 03 08:26:40 crc kubenswrapper[4831]: I1203 08:26:40.071585 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-67jx5"] Dec 03 08:26:41 crc kubenswrapper[4831]: I1203 08:26:41.030850 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bfbd107-20c1-4f67-bd53-cd68fb695b07" path="/var/lib/kubelet/pods/1bfbd107-20c1-4f67-bd53-cd68fb695b07/volumes" Dec 03 08:27:26 crc kubenswrapper[4831]: I1203 08:27:26.143853 4831 scope.go:117] "RemoveContainer" containerID="addcaa2377214201ea8a70fc4215385378472f1ccfd73b56d1fbbe4459a0925f" Dec 03 08:27:26 crc kubenswrapper[4831]: I1203 08:27:26.173843 4831 scope.go:117] "RemoveContainer" containerID="a64c6b415ba478e145090cd9f084bc6147d762b59ffd9d1e875357ca9346b0f7" Dec 03 08:27:26 crc kubenswrapper[4831]: I1203 08:27:26.238477 4831 scope.go:117] "RemoveContainer" containerID="b3718b6295b09e61d17fb5f2eae8f839312104a9223855aae44a689d3bfde73c" Dec 03 08:28:27 crc kubenswrapper[4831]: I1203 08:28:27.597088 4831 
patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:28:27 crc kubenswrapper[4831]: I1203 08:28:27.597660 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:28:57 crc kubenswrapper[4831]: I1203 08:28:57.597242 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:28:57 crc kubenswrapper[4831]: I1203 08:28:57.597843 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:29:27 crc kubenswrapper[4831]: I1203 08:29:27.596583 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:29:27 crc kubenswrapper[4831]: I1203 08:29:27.597132 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:29:27 crc kubenswrapper[4831]: I1203 08:29:27.597184 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:29:27 crc kubenswrapper[4831]: I1203 08:29:27.598096 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:29:27 crc kubenswrapper[4831]: I1203 08:29:27.598155 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" gracePeriod=600 Dec 03 08:29:27 crc kubenswrapper[4831]: E1203 08:29:27.758901 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:29:28 crc kubenswrapper[4831]: I1203 08:29:28.024426 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" exitCode=0 Dec 03 
08:29:28 crc kubenswrapper[4831]: I1203 08:29:28.024506 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"} Dec 03 08:29:28 crc kubenswrapper[4831]: I1203 08:29:28.024781 4831 scope.go:117] "RemoveContainer" containerID="759264b96e116c4acf060f11d65606e11ddea3512a056f26cfafa45fbfb9fda2" Dec 03 08:29:28 crc kubenswrapper[4831]: I1203 08:29:28.025418 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:29:28 crc kubenswrapper[4831]: E1203 08:29:28.025666 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:29:39 crc kubenswrapper[4831]: I1203 08:29:39.013785 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:29:39 crc kubenswrapper[4831]: E1203 08:29:39.014741 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:29:49 crc kubenswrapper[4831]: I1203 08:29:49.246328 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="307848db-fd00-472b-8653-c35696f43e6d" containerID="bcbdb721cc50dfca87fdf81b8ae2580c32e9414bbb862d2dde2894d486972c0e" exitCode=0 Dec 03 08:29:49 crc kubenswrapper[4831]: I1203 08:29:49.246356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" event={"ID":"307848db-fd00-472b-8653-c35696f43e6d","Type":"ContainerDied","Data":"bcbdb721cc50dfca87fdf81b8ae2580c32e9414bbb862d2dde2894d486972c0e"} Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.695024 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.873681 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ceph\") pod \"307848db-fd00-472b-8653-c35696f43e6d\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.873926 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ssh-key\") pod \"307848db-fd00-472b-8653-c35696f43e6d\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.873954 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpq7w\" (UniqueName: \"kubernetes.io/projected/307848db-fd00-472b-8653-c35696f43e6d-kube-api-access-fpq7w\") pod \"307848db-fd00-472b-8653-c35696f43e6d\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.874008 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-inventory\") pod 
\"307848db-fd00-472b-8653-c35696f43e6d\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.874143 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-tripleo-cleanup-combined-ca-bundle\") pod \"307848db-fd00-472b-8653-c35696f43e6d\" (UID: \"307848db-fd00-472b-8653-c35696f43e6d\") " Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.879393 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-tripleo-cleanup-combined-ca-bundle" (OuterVolumeSpecName: "tripleo-cleanup-combined-ca-bundle") pod "307848db-fd00-472b-8653-c35696f43e6d" (UID: "307848db-fd00-472b-8653-c35696f43e6d"). InnerVolumeSpecName "tripleo-cleanup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.879387 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/307848db-fd00-472b-8653-c35696f43e6d-kube-api-access-fpq7w" (OuterVolumeSpecName: "kube-api-access-fpq7w") pod "307848db-fd00-472b-8653-c35696f43e6d" (UID: "307848db-fd00-472b-8653-c35696f43e6d"). InnerVolumeSpecName "kube-api-access-fpq7w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.880701 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ceph" (OuterVolumeSpecName: "ceph") pod "307848db-fd00-472b-8653-c35696f43e6d" (UID: "307848db-fd00-472b-8653-c35696f43e6d"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.908719 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "307848db-fd00-472b-8653-c35696f43e6d" (UID: "307848db-fd00-472b-8653-c35696f43e6d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.911505 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-inventory" (OuterVolumeSpecName: "inventory") pod "307848db-fd00-472b-8653-c35696f43e6d" (UID: "307848db-fd00-472b-8653-c35696f43e6d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.976694 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.976726 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpq7w\" (UniqueName: \"kubernetes.io/projected/307848db-fd00-472b-8653-c35696f43e6d-kube-api-access-fpq7w\") on node \"crc\" DevicePath \"\"" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.976739 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:29:50 crc kubenswrapper[4831]: I1203 08:29:50.976749 4831 reconciler_common.go:293] "Volume detached for volume \"tripleo-cleanup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-tripleo-cleanup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:29:50 
crc kubenswrapper[4831]: I1203 08:29:50.976758 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/307848db-fd00-472b-8653-c35696f43e6d-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:29:51 crc kubenswrapper[4831]: I1203 08:29:51.271398 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" event={"ID":"307848db-fd00-472b-8653-c35696f43e6d","Type":"ContainerDied","Data":"d69e6727ea15ea945641e2fba35a1ec7a5ce4cf6c0a457ec122a5fc253afd1da"} Dec 03 08:29:51 crc kubenswrapper[4831]: I1203 08:29:51.271460 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t" Dec 03 08:29:51 crc kubenswrapper[4831]: I1203 08:29:51.271445 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d69e6727ea15ea945641e2fba35a1ec7a5ce4cf6c0a457ec122a5fc253afd1da" Dec 03 08:29:53 crc kubenswrapper[4831]: I1203 08:29:53.029201 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:29:53 crc kubenswrapper[4831]: E1203 08:29:53.030094 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.803631 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-ksczq"] Dec 03 08:29:54 crc kubenswrapper[4831]: E1203 08:29:54.804278 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eff03abb-783a-42fa-8771-a455f23f972a" containerName="extract-content" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804290 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff03abb-783a-42fa-8771-a455f23f972a" containerName="extract-content" Dec 03 08:29:54 crc kubenswrapper[4831]: E1203 08:29:54.804327 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="extract-utilities" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804333 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="extract-utilities" Dec 03 08:29:54 crc kubenswrapper[4831]: E1203 08:29:54.804345 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="extract-content" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804352 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="extract-content" Dec 03 08:29:54 crc kubenswrapper[4831]: E1203 08:29:54.804361 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="307848db-fd00-472b-8653-c35696f43e6d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804368 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="307848db-fd00-472b-8653-c35696f43e6d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 03 08:29:54 crc kubenswrapper[4831]: E1203 08:29:54.804376 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff03abb-783a-42fa-8771-a455f23f972a" containerName="registry-server" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804383 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff03abb-783a-42fa-8771-a455f23f972a" containerName="registry-server" Dec 03 08:29:54 crc kubenswrapper[4831]: E1203 08:29:54.804399 4831 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="registry-server" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804404 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="registry-server" Dec 03 08:29:54 crc kubenswrapper[4831]: E1203 08:29:54.804425 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff03abb-783a-42fa-8771-a455f23f972a" containerName="extract-utilities" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804430 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff03abb-783a-42fa-8771-a455f23f972a" containerName="extract-utilities" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804640 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="307848db-fd00-472b-8653-c35696f43e6d" containerName="tripleo-cleanup-tripleo-cleanup-openstack-cell1" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804661 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff03abb-783a-42fa-8771-a455f23f972a" containerName="registry-server" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.804681 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="332abd9c-a42c-444b-96d8-8721c6999921" containerName="registry-server" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.805447 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.807786 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.808011 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.809575 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.810107 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.813367 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-ksczq"] Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.964063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-inventory\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.964145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.964245 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceph\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ceph\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.964282 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:54 crc kubenswrapper[4831]: I1203 08:29:54.964344 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdqdq\" (UniqueName: \"kubernetes.io/projected/8e3a439c-c9c5-432d-8fbd-c9854822d349-kube-api-access-mdqdq\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.066883 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-inventory\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.066982 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: 
I1203 08:29:55.067059 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ceph\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.067100 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.067138 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdqdq\" (UniqueName: \"kubernetes.io/projected/8e3a439c-c9c5-432d-8fbd-c9854822d349-kube-api-access-mdqdq\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.073700 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-inventory\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.074250 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-bootstrap-combined-ca-bundle\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " 
pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.075837 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ceph\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.076663 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ssh-key\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.090372 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdqdq\" (UniqueName: \"kubernetes.io/projected/8e3a439c-c9c5-432d-8fbd-c9854822d349-kube-api-access-mdqdq\") pod \"bootstrap-openstack-openstack-cell1-ksczq\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") " pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.125491 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.700864 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-openstack-openstack-cell1-ksczq"] Dec 03 08:29:55 crc kubenswrapper[4831]: I1203 08:29:55.702523 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:29:56 crc kubenswrapper[4831]: I1203 08:29:56.328624 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" event={"ID":"8e3a439c-c9c5-432d-8fbd-c9854822d349","Type":"ContainerStarted","Data":"fbbb8239229a64e5c4f14d5aaec95f537836db0dcf17629bd01d9fb33231296a"} Dec 03 08:29:56 crc kubenswrapper[4831]: I1203 08:29:56.329180 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" event={"ID":"8e3a439c-c9c5-432d-8fbd-c9854822d349","Type":"ContainerStarted","Data":"efc41627ba1d5c9509dd861f407e12ad8de9859e43f09f44f040c51a4f6441a1"} Dec 03 08:29:56 crc kubenswrapper[4831]: I1203 08:29:56.361694 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" podStartSLOduration=2.169391511 podStartE2EDuration="2.361671869s" podCreationTimestamp="2025-12-03 08:29:54 +0000 UTC" firstStartedPulling="2025-12-03 08:29:55.702330256 +0000 UTC m=+7133.045913764" lastFinishedPulling="2025-12-03 08:29:55.894610604 +0000 UTC m=+7133.238194122" observedRunningTime="2025-12-03 08:29:56.347164418 +0000 UTC m=+7133.690747946" watchObservedRunningTime="2025-12-03 08:29:56.361671869 +0000 UTC m=+7133.705255387" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.144286 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz"] Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.146114 4831 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.151182 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.151221 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.170265 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz"] Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.280752 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/451128f9-59a1-4ac3-a52c-472c4b87c8c5-secret-volume\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.280997 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z26r\" (UniqueName: \"kubernetes.io/projected/451128f9-59a1-4ac3-a52c-472c4b87c8c5-kube-api-access-8z26r\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.281063 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/451128f9-59a1-4ac3-a52c-472c4b87c8c5-config-volume\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.383102 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z26r\" (UniqueName: \"kubernetes.io/projected/451128f9-59a1-4ac3-a52c-472c4b87c8c5-kube-api-access-8z26r\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.383213 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/451128f9-59a1-4ac3-a52c-472c4b87c8c5-config-volume\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.383281 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/451128f9-59a1-4ac3-a52c-472c4b87c8c5-secret-volume\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.384227 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/451128f9-59a1-4ac3-a52c-472c4b87c8c5-config-volume\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.389514 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/451128f9-59a1-4ac3-a52c-472c4b87c8c5-secret-volume\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.409509 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z26r\" (UniqueName: \"kubernetes.io/projected/451128f9-59a1-4ac3-a52c-472c4b87c8c5-kube-api-access-8z26r\") pod \"collect-profiles-29412510-jlcjz\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.470593 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" Dec 03 08:30:00 crc kubenswrapper[4831]: I1203 08:30:00.919902 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz"] Dec 03 08:30:00 crc kubenswrapper[4831]: W1203 08:30:00.952587 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod451128f9_59a1_4ac3_a52c_472c4b87c8c5.slice/crio-4142124d257ec7880c8ce9420e166c893f0e4a4c377572f85ec6517e0ced42c9 WatchSource:0}: Error finding container 4142124d257ec7880c8ce9420e166c893f0e4a4c377572f85ec6517e0ced42c9: Status 404 returned error can't find the container with id 4142124d257ec7880c8ce9420e166c893f0e4a4c377572f85ec6517e0ced42c9 Dec 03 08:30:01 crc kubenswrapper[4831]: I1203 08:30:01.378261 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" event={"ID":"451128f9-59a1-4ac3-a52c-472c4b87c8c5","Type":"ContainerStarted","Data":"3db66a56d102c8c488a606708f65976b7ac7a4369d1975f7b0467d1f366bfc1d"} Dec 03 08:30:01 crc 
kubenswrapper[4831]: I1203 08:30:01.378308 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" event={"ID":"451128f9-59a1-4ac3-a52c-472c4b87c8c5","Type":"ContainerStarted","Data":"4142124d257ec7880c8ce9420e166c893f0e4a4c377572f85ec6517e0ced42c9"}
Dec 03 08:30:01 crc kubenswrapper[4831]: I1203 08:30:01.398985 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" podStartSLOduration=1.398965657 podStartE2EDuration="1.398965657s" podCreationTimestamp="2025-12-03 08:30:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:30:01.398669277 +0000 UTC m=+7138.742252775" watchObservedRunningTime="2025-12-03 08:30:01.398965657 +0000 UTC m=+7138.742549155"
Dec 03 08:30:02 crc kubenswrapper[4831]: I1203 08:30:02.392939 4831 generic.go:334] "Generic (PLEG): container finished" podID="451128f9-59a1-4ac3-a52c-472c4b87c8c5" containerID="3db66a56d102c8c488a606708f65976b7ac7a4369d1975f7b0467d1f366bfc1d" exitCode=0
Dec 03 08:30:02 crc kubenswrapper[4831]: I1203 08:30:02.393005 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" event={"ID":"451128f9-59a1-4ac3-a52c-472c4b87c8c5","Type":"ContainerDied","Data":"3db66a56d102c8c488a606708f65976b7ac7a4369d1975f7b0467d1f366bfc1d"}
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.778312 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz"
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.869773 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/451128f9-59a1-4ac3-a52c-472c4b87c8c5-secret-volume\") pod \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") "
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.869942 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8z26r\" (UniqueName: \"kubernetes.io/projected/451128f9-59a1-4ac3-a52c-472c4b87c8c5-kube-api-access-8z26r\") pod \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") "
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.869975 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/451128f9-59a1-4ac3-a52c-472c4b87c8c5-config-volume\") pod \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\" (UID: \"451128f9-59a1-4ac3-a52c-472c4b87c8c5\") "
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.870784 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/451128f9-59a1-4ac3-a52c-472c4b87c8c5-config-volume" (OuterVolumeSpecName: "config-volume") pod "451128f9-59a1-4ac3-a52c-472c4b87c8c5" (UID: "451128f9-59a1-4ac3-a52c-472c4b87c8c5"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.876521 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/451128f9-59a1-4ac3-a52c-472c4b87c8c5-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "451128f9-59a1-4ac3-a52c-472c4b87c8c5" (UID: "451128f9-59a1-4ac3-a52c-472c4b87c8c5"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.879568 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/451128f9-59a1-4ac3-a52c-472c4b87c8c5-kube-api-access-8z26r" (OuterVolumeSpecName: "kube-api-access-8z26r") pod "451128f9-59a1-4ac3-a52c-472c4b87c8c5" (UID: "451128f9-59a1-4ac3-a52c-472c4b87c8c5"). InnerVolumeSpecName "kube-api-access-8z26r". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.972243 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/451128f9-59a1-4ac3-a52c-472c4b87c8c5-secret-volume\") on node \"crc\" DevicePath \"\""
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.972287 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8z26r\" (UniqueName: \"kubernetes.io/projected/451128f9-59a1-4ac3-a52c-472c4b87c8c5-kube-api-access-8z26r\") on node \"crc\" DevicePath \"\""
Dec 03 08:30:03 crc kubenswrapper[4831]: I1203 08:30:03.972298 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/451128f9-59a1-4ac3-a52c-472c4b87c8c5-config-volume\") on node \"crc\" DevicePath \"\""
Dec 03 08:30:04 crc kubenswrapper[4831]: I1203 08:30:04.013232 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:30:04 crc kubenswrapper[4831]: E1203 08:30:04.013717 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:30:04 crc kubenswrapper[4831]: I1203 08:30:04.425958 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz" event={"ID":"451128f9-59a1-4ac3-a52c-472c4b87c8c5","Type":"ContainerDied","Data":"4142124d257ec7880c8ce9420e166c893f0e4a4c377572f85ec6517e0ced42c9"}
Dec 03 08:30:04 crc kubenswrapper[4831]: I1203 08:30:04.426009 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4142124d257ec7880c8ce9420e166c893f0e4a4c377572f85ec6517e0ced42c9"
Dec 03 08:30:04 crc kubenswrapper[4831]: I1203 08:30:04.426062 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz"
Dec 03 08:30:04 crc kubenswrapper[4831]: I1203 08:30:04.487743 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m"]
Dec 03 08:30:04 crc kubenswrapper[4831]: I1203 08:30:04.496808 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412465-t7g9m"]
Dec 03 08:30:05 crc kubenswrapper[4831]: I1203 08:30:05.027954 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aaf4a58-6301-4f40-99a4-90993f851a8d" path="/var/lib/kubelet/pods/9aaf4a58-6301-4f40-99a4-90993f851a8d/volumes"
Dec 03 08:30:19 crc kubenswrapper[4831]: I1203 08:30:19.012642 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:30:19 crc kubenswrapper[4831]: E1203 08:30:19.013527 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:30:26 crc kubenswrapper[4831]: I1203 08:30:26.377184 4831 scope.go:117] "RemoveContainer" containerID="7e1d6c9217938e0f35c6dd4a17856d7002ac9e4bdfab8ece00be02a1dda9e274"
Dec 03 08:30:34 crc kubenswrapper[4831]: I1203 08:30:34.013531 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:30:34 crc kubenswrapper[4831]: E1203 08:30:34.014424 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:30:49 crc kubenswrapper[4831]: I1203 08:30:49.012688 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:30:49 crc kubenswrapper[4831]: E1203 08:30:49.013563 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:31:04 crc kubenswrapper[4831]: I1203 08:31:04.012834 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:31:04 crc kubenswrapper[4831]: E1203 08:31:04.013757 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:31:19 crc kubenswrapper[4831]: I1203 08:31:19.014041 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:31:19 crc kubenswrapper[4831]: E1203 08:31:19.014988 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:31:30 crc kubenswrapper[4831]: I1203 08:31:30.013784 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:31:30 crc kubenswrapper[4831]: E1203 08:31:30.014594 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.346643 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-xdtjg"]
Dec 03 08:31:41 crc kubenswrapper[4831]: E1203 08:31:41.347805 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="451128f9-59a1-4ac3-a52c-472c4b87c8c5" containerName="collect-profiles"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.347824 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="451128f9-59a1-4ac3-a52c-472c4b87c8c5" containerName="collect-profiles"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.348163 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="451128f9-59a1-4ac3-a52c-472c4b87c8c5" containerName="collect-profiles"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.350175 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.360674 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdtjg"]
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.457045 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tnvf\" (UniqueName: \"kubernetes.io/projected/b8d69ecf-0220-401b-afc0-dedff203b7b5-kube-api-access-8tnvf\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.457650 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-utilities\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.457817 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-catalog-content\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.559409 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-utilities\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.559765 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-catalog-content\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.559848 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8tnvf\" (UniqueName: \"kubernetes.io/projected/b8d69ecf-0220-401b-afc0-dedff203b7b5-kube-api-access-8tnvf\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.560782 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-utilities\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.561101 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-catalog-content\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.585003 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tnvf\" (UniqueName: \"kubernetes.io/projected/b8d69ecf-0220-401b-afc0-dedff203b7b5-kube-api-access-8tnvf\") pod \"redhat-operators-xdtjg\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") " pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:41 crc kubenswrapper[4831]: I1203 08:31:41.701229 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:42 crc kubenswrapper[4831]: I1203 08:31:42.392365 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-xdtjg"]
Dec 03 08:31:42 crc kubenswrapper[4831]: I1203 08:31:42.444608 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdtjg" event={"ID":"b8d69ecf-0220-401b-afc0-dedff203b7b5","Type":"ContainerStarted","Data":"c6a4bc57c50dfb8f0fd5015df4a9670228237eb9ddeda1dd326578d9ab080a20"}
Dec 03 08:31:43 crc kubenswrapper[4831]: I1203 08:31:43.459113 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerID="4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c" exitCode=0
Dec 03 08:31:43 crc kubenswrapper[4831]: I1203 08:31:43.459511 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdtjg" event={"ID":"b8d69ecf-0220-401b-afc0-dedff203b7b5","Type":"ContainerDied","Data":"4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c"}
Dec 03 08:31:44 crc kubenswrapper[4831]: I1203 08:31:44.013124 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:31:44 crc kubenswrapper[4831]: E1203 08:31:44.013477 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:31:44 crc kubenswrapper[4831]: I1203 08:31:44.470963 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdtjg" event={"ID":"b8d69ecf-0220-401b-afc0-dedff203b7b5","Type":"ContainerStarted","Data":"6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678"}
Dec 03 08:31:46 crc kubenswrapper[4831]: I1203 08:31:46.492531 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerID="6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678" exitCode=0
Dec 03 08:31:46 crc kubenswrapper[4831]: I1203 08:31:46.492601 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdtjg" event={"ID":"b8d69ecf-0220-401b-afc0-dedff203b7b5","Type":"ContainerDied","Data":"6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678"}
Dec 03 08:31:48 crc kubenswrapper[4831]: I1203 08:31:48.529266 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdtjg" event={"ID":"b8d69ecf-0220-401b-afc0-dedff203b7b5","Type":"ContainerStarted","Data":"4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9"}
Dec 03 08:31:48 crc kubenswrapper[4831]: I1203 08:31:48.557154 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-xdtjg" podStartSLOduration=2.974999723 podStartE2EDuration="7.557134657s" podCreationTimestamp="2025-12-03 08:31:41 +0000 UTC" firstStartedPulling="2025-12-03 08:31:43.464133644 +0000 UTC m=+7240.807717152" lastFinishedPulling="2025-12-03 08:31:48.046268578 +0000 UTC m=+7245.389852086" observedRunningTime="2025-12-03 08:31:48.553301707 +0000 UTC m=+7245.896885215" watchObservedRunningTime="2025-12-03 08:31:48.557134657 +0000 UTC m=+7245.900718165"
Dec 03 08:31:51 crc kubenswrapper[4831]: I1203 08:31:51.702071 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:51 crc kubenswrapper[4831]: I1203 08:31:51.704413 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:31:52 crc kubenswrapper[4831]: I1203 08:31:52.747274 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-xdtjg" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="registry-server" probeResult="failure" output=<
Dec 03 08:31:52 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s
Dec 03 08:31:52 crc kubenswrapper[4831]: >
Dec 03 08:31:57 crc kubenswrapper[4831]: I1203 08:31:57.012892 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:31:57 crc kubenswrapper[4831]: E1203 08:31:57.013765 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:32:01 crc kubenswrapper[4831]: I1203 08:32:01.767886 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:32:01 crc kubenswrapper[4831]: I1203 08:32:01.825025 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:32:02 crc kubenswrapper[4831]: I1203 08:32:02.011246 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdtjg"]
Dec 03 08:32:03 crc kubenswrapper[4831]: I1203 08:32:03.666721 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-xdtjg" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="registry-server" containerID="cri-o://4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9" gracePeriod=2
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.205706 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.283027 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-catalog-content\") pod \"b8d69ecf-0220-401b-afc0-dedff203b7b5\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") "
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.283165 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tnvf\" (UniqueName: \"kubernetes.io/projected/b8d69ecf-0220-401b-afc0-dedff203b7b5-kube-api-access-8tnvf\") pod \"b8d69ecf-0220-401b-afc0-dedff203b7b5\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") "
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.283272 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-utilities\") pod \"b8d69ecf-0220-401b-afc0-dedff203b7b5\" (UID: \"b8d69ecf-0220-401b-afc0-dedff203b7b5\") "
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.284176 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-utilities" (OuterVolumeSpecName: "utilities") pod "b8d69ecf-0220-401b-afc0-dedff203b7b5" (UID: "b8d69ecf-0220-401b-afc0-dedff203b7b5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.288198 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d69ecf-0220-401b-afc0-dedff203b7b5-kube-api-access-8tnvf" (OuterVolumeSpecName: "kube-api-access-8tnvf") pod "b8d69ecf-0220-401b-afc0-dedff203b7b5" (UID: "b8d69ecf-0220-401b-afc0-dedff203b7b5"). InnerVolumeSpecName "kube-api-access-8tnvf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.385312 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-utilities\") on node \"crc\" DevicePath \"\""
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.385355 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tnvf\" (UniqueName: \"kubernetes.io/projected/b8d69ecf-0220-401b-afc0-dedff203b7b5-kube-api-access-8tnvf\") on node \"crc\" DevicePath \"\""
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.427432 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b8d69ecf-0220-401b-afc0-dedff203b7b5" (UID: "b8d69ecf-0220-401b-afc0-dedff203b7b5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.486870 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b8d69ecf-0220-401b-afc0-dedff203b7b5-catalog-content\") on node \"crc\" DevicePath \"\""
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.679023 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerID="4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9" exitCode=0
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.679065 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdtjg" event={"ID":"b8d69ecf-0220-401b-afc0-dedff203b7b5","Type":"ContainerDied","Data":"4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9"}
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.679104 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-xdtjg"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.679140 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-xdtjg" event={"ID":"b8d69ecf-0220-401b-afc0-dedff203b7b5","Type":"ContainerDied","Data":"c6a4bc57c50dfb8f0fd5015df4a9670228237eb9ddeda1dd326578d9ab080a20"}
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.679169 4831 scope.go:117] "RemoveContainer" containerID="4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.716649 4831 scope.go:117] "RemoveContainer" containerID="6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.727995 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-xdtjg"]
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.736750 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-xdtjg"]
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.753443 4831 scope.go:117] "RemoveContainer" containerID="4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.789756 4831 scope.go:117] "RemoveContainer" containerID="4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9"
Dec 03 08:32:04 crc kubenswrapper[4831]: E1203 08:32:04.790178 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9\": container with ID starting with 4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9 not found: ID does not exist" containerID="4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.790222 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9"} err="failed to get container status \"4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9\": rpc error: code = NotFound desc = could not find container \"4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9\": container with ID starting with 4db257333628a06ed7322ecd0445f31f032b9a2e627fdba4d885a3f0637830c9 not found: ID does not exist"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.790250 4831 scope.go:117] "RemoveContainer" containerID="6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678"
Dec 03 08:32:04 crc kubenswrapper[4831]: E1203 08:32:04.790579 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678\": container with ID starting with 6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678 not found: ID does not exist" containerID="6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.790607 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678"} err="failed to get container status \"6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678\": rpc error: code = NotFound desc = could not find container \"6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678\": container with ID starting with 6be4977f42f53574004eaef1a9cb352cee8c627235a001fb8abb2ee23f601678 not found: ID does not exist"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.790625 4831 scope.go:117] "RemoveContainer" containerID="4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c"
Dec 03 08:32:04 crc kubenswrapper[4831]: E1203 08:32:04.790967 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c\": container with ID starting with 4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c not found: ID does not exist" containerID="4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c"
Dec 03 08:32:04 crc kubenswrapper[4831]: I1203 08:32:04.790993 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c"} err="failed to get container status \"4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c\": rpc error: code = NotFound desc = could not find container \"4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c\": container with ID starting with 4db75f7c2cadff9f3bab7132198ea34a8c04dd5623edb951141224662132910c not found: ID does not exist"
Dec 03 08:32:05 crc kubenswrapper[4831]: I1203 08:32:05.027136 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" path="/var/lib/kubelet/pods/b8d69ecf-0220-401b-afc0-dedff203b7b5/volumes"
Dec 03 08:32:10 crc kubenswrapper[4831]: I1203 08:32:10.013011 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:32:10 crc kubenswrapper[4831]: E1203 08:32:10.013899 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:32:22 crc kubenswrapper[4831]: I1203 08:32:22.013501 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:32:22 crc kubenswrapper[4831]: E1203 08:32:22.014835 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:32:33 crc kubenswrapper[4831]: I1203 08:32:33.029605 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:32:33 crc kubenswrapper[4831]: E1203 08:32:33.031742 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:32:44 crc kubenswrapper[4831]: I1203 08:32:44.012788 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:32:44 crc kubenswrapper[4831]: E1203 08:32:44.013628 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:32:58 crc kubenswrapper[4831]: I1203 08:32:58.013104 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:32:58 crc kubenswrapper[4831]: E1203 08:32:58.014399 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:33:07 crc kubenswrapper[4831]: I1203 08:33:07.426684 4831 generic.go:334] "Generic (PLEG): container finished" podID="8e3a439c-c9c5-432d-8fbd-c9854822d349" containerID="fbbb8239229a64e5c4f14d5aaec95f537836db0dcf17629bd01d9fb33231296a" exitCode=0
Dec 03 08:33:07 crc kubenswrapper[4831]: I1203 08:33:07.426790 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" event={"ID":"8e3a439c-c9c5-432d-8fbd-c9854822d349","Type":"ContainerDied","Data":"fbbb8239229a64e5c4f14d5aaec95f537836db0dcf17629bd01d9fb33231296a"}
Dec 03 08:33:08 crc kubenswrapper[4831]: I1203 08:33:08.916913 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq"
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.013961 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4"
Dec 03 08:33:09 crc kubenswrapper[4831]: E1203 08:33:09.014449 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.058652 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-bootstrap-combined-ca-bundle\") pod \"8e3a439c-c9c5-432d-8fbd-c9854822d349\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") "
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.058821 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-inventory\") pod \"8e3a439c-c9c5-432d-8fbd-c9854822d349\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") "
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.058895 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdqdq\" (UniqueName: \"kubernetes.io/projected/8e3a439c-c9c5-432d-8fbd-c9854822d349-kube-api-access-mdqdq\") pod \"8e3a439c-c9c5-432d-8fbd-c9854822d349\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") "
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.059860 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ceph\") pod \"8e3a439c-c9c5-432d-8fbd-c9854822d349\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") "
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.059981 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ssh-key\") pod \"8e3a439c-c9c5-432d-8fbd-c9854822d349\" (UID: \"8e3a439c-c9c5-432d-8fbd-c9854822d349\") "
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.065190 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8e3a439c-c9c5-432d-8fbd-c9854822d349" (UID: "8e3a439c-c9c5-432d-8fbd-c9854822d349"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.065903 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ceph" (OuterVolumeSpecName: "ceph") pod "8e3a439c-c9c5-432d-8fbd-c9854822d349" (UID: "8e3a439c-c9c5-432d-8fbd-c9854822d349"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.066432 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e3a439c-c9c5-432d-8fbd-c9854822d349-kube-api-access-mdqdq" (OuterVolumeSpecName: "kube-api-access-mdqdq") pod "8e3a439c-c9c5-432d-8fbd-c9854822d349" (UID: "8e3a439c-c9c5-432d-8fbd-c9854822d349"). InnerVolumeSpecName "kube-api-access-mdqdq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.097463 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-inventory" (OuterVolumeSpecName: "inventory") pod "8e3a439c-c9c5-432d-8fbd-c9854822d349" (UID: "8e3a439c-c9c5-432d-8fbd-c9854822d349"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.105281 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8e3a439c-c9c5-432d-8fbd-c9854822d349" (UID: "8e3a439c-c9c5-432d-8fbd-c9854822d349"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.163764 4831 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.163812 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-inventory\") on node \"crc\" DevicePath \"\""
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.163825 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdqdq\" (UniqueName: \"kubernetes.io/projected/8e3a439c-c9c5-432d-8fbd-c9854822d349-kube-api-access-mdqdq\") on node \"crc\" DevicePath \"\""
Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.163841 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ceph\") on node \"crc\" DevicePath \"\""
Dec 03 08:33:09 crc 
kubenswrapper[4831]: I1203 08:33:09.163854 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8e3a439c-c9c5-432d-8fbd-c9854822d349-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.449244 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" event={"ID":"8e3a439c-c9c5-432d-8fbd-c9854822d349","Type":"ContainerDied","Data":"efc41627ba1d5c9509dd861f407e12ad8de9859e43f09f44f040c51a4f6441a1"} Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.449294 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efc41627ba1d5c9509dd861f407e12ad8de9859e43f09f44f040c51a4f6441a1" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.449404 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-openstack-openstack-cell1-ksczq" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.632931 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-nzmhv"] Dec 03 08:33:09 crc kubenswrapper[4831]: E1203 08:33:09.633442 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e3a439c-c9c5-432d-8fbd-c9854822d349" containerName="bootstrap-openstack-openstack-cell1" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.633463 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e3a439c-c9c5-432d-8fbd-c9854822d349" containerName="bootstrap-openstack-openstack-cell1" Dec 03 08:33:09 crc kubenswrapper[4831]: E1203 08:33:09.633488 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="extract-utilities" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.633497 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="extract-utilities" Dec 
03 08:33:09 crc kubenswrapper[4831]: E1203 08:33:09.633512 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="registry-server" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.633519 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="registry-server" Dec 03 08:33:09 crc kubenswrapper[4831]: E1203 08:33:09.633563 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="extract-content" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.633571 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="extract-content" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.633846 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e3a439c-c9c5-432d-8fbd-c9854822d349" containerName="bootstrap-openstack-openstack-cell1" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.633881 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d69ecf-0220-401b-afc0-dedff203b7b5" containerName="registry-server" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.634856 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.640952 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.641118 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.641408 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.641721 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.649581 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-nzmhv"] Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.777563 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ssh-key\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.778095 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ceph\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.778445 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-inventory\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.778803 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc2sr\" (UniqueName: \"kubernetes.io/projected/13bbbc69-d429-4189-a64f-070d16440ed4-kube-api-access-mc2sr\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.882001 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc2sr\" (UniqueName: \"kubernetes.io/projected/13bbbc69-d429-4189-a64f-070d16440ed4-kube-api-access-mc2sr\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.882248 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ssh-key\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.882351 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ceph\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc 
kubenswrapper[4831]: I1203 08:33:09.882435 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-inventory\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.888931 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-inventory\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.890529 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ceph\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.901692 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ssh-key\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.906965 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc2sr\" (UniqueName: \"kubernetes.io/projected/13bbbc69-d429-4189-a64f-070d16440ed4-kube-api-access-mc2sr\") pod \"download-cache-openstack-openstack-cell1-nzmhv\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " 
pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:09 crc kubenswrapper[4831]: I1203 08:33:09.957652 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:33:10 crc kubenswrapper[4831]: I1203 08:33:10.506105 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-openstack-openstack-cell1-nzmhv"] Dec 03 08:33:11 crc kubenswrapper[4831]: I1203 08:33:11.470760 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" event={"ID":"13bbbc69-d429-4189-a64f-070d16440ed4","Type":"ContainerStarted","Data":"e740800a08710b77ad0f51137e8e04ffb9c10981772d80114a5ceb68494fbaff"} Dec 03 08:33:11 crc kubenswrapper[4831]: I1203 08:33:11.472120 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" event={"ID":"13bbbc69-d429-4189-a64f-070d16440ed4","Type":"ContainerStarted","Data":"c00d81eb07e2d8c89dabcd3aa0ace8bc07f0c0d4476a19120fe80db055b3421a"} Dec 03 08:33:11 crc kubenswrapper[4831]: I1203 08:33:11.495712 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" podStartSLOduration=2.300157563 podStartE2EDuration="2.495688222s" podCreationTimestamp="2025-12-03 08:33:09 +0000 UTC" firstStartedPulling="2025-12-03 08:33:10.51170721 +0000 UTC m=+7327.855290718" lastFinishedPulling="2025-12-03 08:33:10.707237869 +0000 UTC m=+7328.050821377" observedRunningTime="2025-12-03 08:33:11.493904237 +0000 UTC m=+7328.837487745" watchObservedRunningTime="2025-12-03 08:33:11.495688222 +0000 UTC m=+7328.839271750" Dec 03 08:33:22 crc kubenswrapper[4831]: I1203 08:33:22.013284 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:33:22 crc kubenswrapper[4831]: E1203 08:33:22.014242 
4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:33:37 crc kubenswrapper[4831]: I1203 08:33:37.013231 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:33:37 crc kubenswrapper[4831]: E1203 08:33:37.015256 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:33:50 crc kubenswrapper[4831]: I1203 08:33:50.013985 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:33:50 crc kubenswrapper[4831]: E1203 08:33:50.015380 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:34:03 crc kubenswrapper[4831]: I1203 08:34:03.022813 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:34:03 crc kubenswrapper[4831]: E1203 
08:34:03.023622 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:34:14 crc kubenswrapper[4831]: I1203 08:34:14.013595 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:34:14 crc kubenswrapper[4831]: E1203 08:34:14.014460 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.160400 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-lf26g"] Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.173081 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.184596 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf26g"] Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.226956 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-catalog-content\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.227027 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h47qv\" (UniqueName: \"kubernetes.io/projected/2f4c82ff-cac7-4b5f-804b-465bc5540971-kube-api-access-h47qv\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.227068 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-utilities\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.328764 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-catalog-content\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.328873 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-h47qv\" (UniqueName: \"kubernetes.io/projected/2f4c82ff-cac7-4b5f-804b-465bc5540971-kube-api-access-h47qv\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.328941 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-utilities\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.330116 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-catalog-content\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.330152 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-utilities\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.353709 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h47qv\" (UniqueName: \"kubernetes.io/projected/2f4c82ff-cac7-4b5f-804b-465bc5540971-kube-api-access-h47qv\") pod \"redhat-marketplace-lf26g\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:18 crc kubenswrapper[4831]: I1203 08:34:18.505388 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:19 crc kubenswrapper[4831]: I1203 08:34:19.030816 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf26g"] Dec 03 08:34:19 crc kubenswrapper[4831]: I1203 08:34:19.226418 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf26g" event={"ID":"2f4c82ff-cac7-4b5f-804b-465bc5540971","Type":"ContainerStarted","Data":"1937988f236aaef71f224bf6782b3c83b8de179f53c9f4a6e0c44767bfc53140"} Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.238810 4831 generic.go:334] "Generic (PLEG): container finished" podID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerID="195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220" exitCode=0 Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.238887 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf26g" event={"ID":"2f4c82ff-cac7-4b5f-804b-465bc5540971","Type":"ContainerDied","Data":"195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220"} Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.539415 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8p62s"] Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.543821 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.551270 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p62s"] Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.583105 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzrnx\" (UniqueName: \"kubernetes.io/projected/705d39fb-c760-41ed-b5d6-ae4d8387631c-kube-api-access-gzrnx\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.583240 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-catalog-content\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.583379 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-utilities\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.685642 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzrnx\" (UniqueName: \"kubernetes.io/projected/705d39fb-c760-41ed-b5d6-ae4d8387631c-kube-api-access-gzrnx\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.685801 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-catalog-content\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.685879 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-utilities\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.686560 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-catalog-content\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.686589 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-utilities\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.715641 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzrnx\" (UniqueName: \"kubernetes.io/projected/705d39fb-c760-41ed-b5d6-ae4d8387631c-kube-api-access-gzrnx\") pod \"community-operators-8p62s\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:20 crc kubenswrapper[4831]: I1203 08:34:20.889583 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.153645 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bk97c"] Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.157046 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.178875 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bk97c"] Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.194693 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-utilities\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.194909 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd8vd\" (UniqueName: \"kubernetes.io/projected/140e7ca4-e997-4f24-a97c-5393b46dcd38-kube-api-access-fd8vd\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.195095 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-catalog-content\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.297341 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-utilities\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.297666 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd8vd\" (UniqueName: \"kubernetes.io/projected/140e7ca4-e997-4f24-a97c-5393b46dcd38-kube-api-access-fd8vd\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.297722 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-catalog-content\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.297900 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-utilities\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.298105 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-catalog-content\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.316402 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd8vd\" (UniqueName: 
\"kubernetes.io/projected/140e7ca4-e997-4f24-a97c-5393b46dcd38-kube-api-access-fd8vd\") pod \"certified-operators-bk97c\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.492960 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8p62s"] Dec 03 08:34:21 crc kubenswrapper[4831]: W1203 08:34:21.497912 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod705d39fb_c760_41ed_b5d6_ae4d8387631c.slice/crio-fa1bac761e086e6bc3f65a5d1052346476b4e0f0aa2adb4604476fc15fca7df3 WatchSource:0}: Error finding container fa1bac761e086e6bc3f65a5d1052346476b4e0f0aa2adb4604476fc15fca7df3: Status 404 returned error can't find the container with id fa1bac761e086e6bc3f65a5d1052346476b4e0f0aa2adb4604476fc15fca7df3 Dec 03 08:34:21 crc kubenswrapper[4831]: I1203 08:34:21.510034 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:22 crc kubenswrapper[4831]: I1203 08:34:22.086876 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bk97c"] Dec 03 08:34:22 crc kubenswrapper[4831]: I1203 08:34:22.258979 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk97c" event={"ID":"140e7ca4-e997-4f24-a97c-5393b46dcd38","Type":"ContainerStarted","Data":"d4757e6eb485dcf1f938b26fd5f5184c52ae7bc5aa97f8d04554c0822f8bfd01"} Dec 03 08:34:22 crc kubenswrapper[4831]: I1203 08:34:22.261509 4831 generic.go:334] "Generic (PLEG): container finished" podID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerID="b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194" exitCode=0 Dec 03 08:34:22 crc kubenswrapper[4831]: I1203 08:34:22.261578 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf26g" event={"ID":"2f4c82ff-cac7-4b5f-804b-465bc5540971","Type":"ContainerDied","Data":"b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194"} Dec 03 08:34:22 crc kubenswrapper[4831]: I1203 08:34:22.263043 4831 generic.go:334] "Generic (PLEG): container finished" podID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerID="38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d" exitCode=0 Dec 03 08:34:22 crc kubenswrapper[4831]: I1203 08:34:22.263081 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p62s" event={"ID":"705d39fb-c760-41ed-b5d6-ae4d8387631c","Type":"ContainerDied","Data":"38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d"} Dec 03 08:34:22 crc kubenswrapper[4831]: I1203 08:34:22.263103 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p62s" 
event={"ID":"705d39fb-c760-41ed-b5d6-ae4d8387631c","Type":"ContainerStarted","Data":"fa1bac761e086e6bc3f65a5d1052346476b4e0f0aa2adb4604476fc15fca7df3"} Dec 03 08:34:23 crc kubenswrapper[4831]: I1203 08:34:23.274541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf26g" event={"ID":"2f4c82ff-cac7-4b5f-804b-465bc5540971","Type":"ContainerStarted","Data":"eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a"} Dec 03 08:34:23 crc kubenswrapper[4831]: I1203 08:34:23.277033 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p62s" event={"ID":"705d39fb-c760-41ed-b5d6-ae4d8387631c","Type":"ContainerStarted","Data":"de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d"} Dec 03 08:34:23 crc kubenswrapper[4831]: I1203 08:34:23.279095 4831 generic.go:334] "Generic (PLEG): container finished" podID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerID="3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825" exitCode=0 Dec 03 08:34:23 crc kubenswrapper[4831]: I1203 08:34:23.279126 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk97c" event={"ID":"140e7ca4-e997-4f24-a97c-5393b46dcd38","Type":"ContainerDied","Data":"3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825"} Dec 03 08:34:23 crc kubenswrapper[4831]: I1203 08:34:23.299679 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-lf26g" podStartSLOduration=2.839992164 podStartE2EDuration="5.299655382s" podCreationTimestamp="2025-12-03 08:34:18 +0000 UTC" firstStartedPulling="2025-12-03 08:34:20.24354014 +0000 UTC m=+7397.587123648" lastFinishedPulling="2025-12-03 08:34:22.703203358 +0000 UTC m=+7400.046786866" observedRunningTime="2025-12-03 08:34:23.295823783 +0000 UTC m=+7400.639407291" watchObservedRunningTime="2025-12-03 08:34:23.299655382 +0000 UTC 
m=+7400.643238890" Dec 03 08:34:24 crc kubenswrapper[4831]: I1203 08:34:24.288807 4831 generic.go:334] "Generic (PLEG): container finished" podID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerID="de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d" exitCode=0 Dec 03 08:34:24 crc kubenswrapper[4831]: I1203 08:34:24.288864 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p62s" event={"ID":"705d39fb-c760-41ed-b5d6-ae4d8387631c","Type":"ContainerDied","Data":"de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d"} Dec 03 08:34:25 crc kubenswrapper[4831]: I1203 08:34:25.312801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk97c" event={"ID":"140e7ca4-e997-4f24-a97c-5393b46dcd38","Type":"ContainerStarted","Data":"86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2"} Dec 03 08:34:25 crc kubenswrapper[4831]: I1203 08:34:25.329938 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p62s" event={"ID":"705d39fb-c760-41ed-b5d6-ae4d8387631c","Type":"ContainerStarted","Data":"f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d"} Dec 03 08:34:25 crc kubenswrapper[4831]: I1203 08:34:25.368398 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8p62s" podStartSLOduration=2.736949328 podStartE2EDuration="5.368379845s" podCreationTimestamp="2025-12-03 08:34:20 +0000 UTC" firstStartedPulling="2025-12-03 08:34:22.264899818 +0000 UTC m=+7399.608483326" lastFinishedPulling="2025-12-03 08:34:24.896330335 +0000 UTC m=+7402.239913843" observedRunningTime="2025-12-03 08:34:25.36370433 +0000 UTC m=+7402.707287838" watchObservedRunningTime="2025-12-03 08:34:25.368379845 +0000 UTC m=+7402.711963353" Dec 03 08:34:26 crc kubenswrapper[4831]: I1203 08:34:26.351051 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerID="86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2" exitCode=0 Dec 03 08:34:26 crc kubenswrapper[4831]: I1203 08:34:26.351415 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk97c" event={"ID":"140e7ca4-e997-4f24-a97c-5393b46dcd38","Type":"ContainerDied","Data":"86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2"} Dec 03 08:34:27 crc kubenswrapper[4831]: I1203 08:34:27.013366 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:34:27 crc kubenswrapper[4831]: E1203 08:34:27.014626 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:34:27 crc kubenswrapper[4831]: I1203 08:34:27.368092 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk97c" event={"ID":"140e7ca4-e997-4f24-a97c-5393b46dcd38","Type":"ContainerStarted","Data":"1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63"} Dec 03 08:34:27 crc kubenswrapper[4831]: I1203 08:34:27.397092 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bk97c" podStartSLOduration=2.840731753 podStartE2EDuration="6.397067681s" podCreationTimestamp="2025-12-03 08:34:21 +0000 UTC" firstStartedPulling="2025-12-03 08:34:23.280831927 +0000 UTC m=+7400.624415435" lastFinishedPulling="2025-12-03 08:34:26.837167855 +0000 UTC m=+7404.180751363" observedRunningTime="2025-12-03 08:34:27.389859247 +0000 UTC 
m=+7404.733442765" watchObservedRunningTime="2025-12-03 08:34:27.397067681 +0000 UTC m=+7404.740651199" Dec 03 08:34:28 crc kubenswrapper[4831]: I1203 08:34:28.506441 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:28 crc kubenswrapper[4831]: I1203 08:34:28.506653 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:28 crc kubenswrapper[4831]: I1203 08:34:28.565486 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:29 crc kubenswrapper[4831]: I1203 08:34:29.437441 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:30 crc kubenswrapper[4831]: I1203 08:34:30.890021 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:30 crc kubenswrapper[4831]: I1203 08:34:30.890305 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:30 crc kubenswrapper[4831]: I1203 08:34:30.946415 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:31 crc kubenswrapper[4831]: I1203 08:34:31.134132 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf26g"] Dec 03 08:34:31 crc kubenswrapper[4831]: I1203 08:34:31.409399 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-lf26g" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerName="registry-server" containerID="cri-o://eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a" gracePeriod=2 Dec 03 08:34:31 crc 
kubenswrapper[4831]: I1203 08:34:31.473182 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:31 crc kubenswrapper[4831]: I1203 08:34:31.510312 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:31 crc kubenswrapper[4831]: I1203 08:34:31.510390 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:31 crc kubenswrapper[4831]: I1203 08:34:31.582629 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:31 crc kubenswrapper[4831]: I1203 08:34:31.951173 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.049916 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-utilities\") pod \"2f4c82ff-cac7-4b5f-804b-465bc5540971\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.050172 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-catalog-content\") pod \"2f4c82ff-cac7-4b5f-804b-465bc5540971\" (UID: \"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.050200 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h47qv\" (UniqueName: \"kubernetes.io/projected/2f4c82ff-cac7-4b5f-804b-465bc5540971-kube-api-access-h47qv\") pod \"2f4c82ff-cac7-4b5f-804b-465bc5540971\" (UID: 
\"2f4c82ff-cac7-4b5f-804b-465bc5540971\") " Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.050606 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-utilities" (OuterVolumeSpecName: "utilities") pod "2f4c82ff-cac7-4b5f-804b-465bc5540971" (UID: "2f4c82ff-cac7-4b5f-804b-465bc5540971"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.051490 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.057678 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f4c82ff-cac7-4b5f-804b-465bc5540971-kube-api-access-h47qv" (OuterVolumeSpecName: "kube-api-access-h47qv") pod "2f4c82ff-cac7-4b5f-804b-465bc5540971" (UID: "2f4c82ff-cac7-4b5f-804b-465bc5540971"). InnerVolumeSpecName "kube-api-access-h47qv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.069969 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2f4c82ff-cac7-4b5f-804b-465bc5540971" (UID: "2f4c82ff-cac7-4b5f-804b-465bc5540971"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.154632 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2f4c82ff-cac7-4b5f-804b-465bc5540971-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.154676 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h47qv\" (UniqueName: \"kubernetes.io/projected/2f4c82ff-cac7-4b5f-804b-465bc5540971-kube-api-access-h47qv\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.423000 4831 generic.go:334] "Generic (PLEG): container finished" podID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerID="eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a" exitCode=0 Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.423019 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-lf26g" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.423051 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf26g" event={"ID":"2f4c82ff-cac7-4b5f-804b-465bc5540971","Type":"ContainerDied","Data":"eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a"} Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.423793 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-lf26g" event={"ID":"2f4c82ff-cac7-4b5f-804b-465bc5540971","Type":"ContainerDied","Data":"1937988f236aaef71f224bf6782b3c83b8de179f53c9f4a6e0c44767bfc53140"} Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.423816 4831 scope.go:117] "RemoveContainer" containerID="eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.450307 4831 scope.go:117] "RemoveContainer" 
containerID="b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.462884 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf26g"] Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.476967 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-lf26g"] Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.496064 4831 scope.go:117] "RemoveContainer" containerID="195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.496786 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.556309 4831 scope.go:117] "RemoveContainer" containerID="eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a" Dec 03 08:34:32 crc kubenswrapper[4831]: E1203 08:34:32.556996 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a\": container with ID starting with eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a not found: ID does not exist" containerID="eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.557042 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a"} err="failed to get container status \"eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a\": rpc error: code = NotFound desc = could not find container \"eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a\": container with ID starting with eb98c51982a0f6625fcc27a0e65d8b6ee3ba55e863475b65f3262488939a7b6a 
not found: ID does not exist" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.557069 4831 scope.go:117] "RemoveContainer" containerID="b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194" Dec 03 08:34:32 crc kubenswrapper[4831]: E1203 08:34:32.557551 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194\": container with ID starting with b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194 not found: ID does not exist" containerID="b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.557658 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194"} err="failed to get container status \"b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194\": rpc error: code = NotFound desc = could not find container \"b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194\": container with ID starting with b7f08fd0584d3d2ebb2f22a1150e20cf1efea958c737a6b347ccc6561cb7b194 not found: ID does not exist" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.557776 4831 scope.go:117] "RemoveContainer" containerID="195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220" Dec 03 08:34:32 crc kubenswrapper[4831]: E1203 08:34:32.558232 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220\": container with ID starting with 195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220 not found: ID does not exist" containerID="195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220" Dec 03 08:34:32 crc kubenswrapper[4831]: I1203 08:34:32.558363 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220"} err="failed to get container status \"195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220\": rpc error: code = NotFound desc = could not find container \"195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220\": container with ID starting with 195fb89fdb5181b3263633ca00710c4dd9ac0d0ac1c7f921ec3726b896771220 not found: ID does not exist" Dec 03 08:34:33 crc kubenswrapper[4831]: I1203 08:34:33.032950 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" path="/var/lib/kubelet/pods/2f4c82ff-cac7-4b5f-804b-465bc5540971/volumes" Dec 03 08:34:33 crc kubenswrapper[4831]: I1203 08:34:33.331884 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p62s"] Dec 03 08:34:33 crc kubenswrapper[4831]: I1203 08:34:33.432258 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8p62s" podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerName="registry-server" containerID="cri-o://f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d" gracePeriod=2 Dec 03 08:34:33 crc kubenswrapper[4831]: I1203 08:34:33.911155 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:33 crc kubenswrapper[4831]: I1203 08:34:33.995424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzrnx\" (UniqueName: \"kubernetes.io/projected/705d39fb-c760-41ed-b5d6-ae4d8387631c-kube-api-access-gzrnx\") pod \"705d39fb-c760-41ed-b5d6-ae4d8387631c\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " Dec 03 08:34:33 crc kubenswrapper[4831]: I1203 08:34:33.995907 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-catalog-content\") pod \"705d39fb-c760-41ed-b5d6-ae4d8387631c\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " Dec 03 08:34:33 crc kubenswrapper[4831]: I1203 08:34:33.995943 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-utilities\") pod \"705d39fb-c760-41ed-b5d6-ae4d8387631c\" (UID: \"705d39fb-c760-41ed-b5d6-ae4d8387631c\") " Dec 03 08:34:33 crc kubenswrapper[4831]: I1203 08:34:33.998182 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-utilities" (OuterVolumeSpecName: "utilities") pod "705d39fb-c760-41ed-b5d6-ae4d8387631c" (UID: "705d39fb-c760-41ed-b5d6-ae4d8387631c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.003092 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/705d39fb-c760-41ed-b5d6-ae4d8387631c-kube-api-access-gzrnx" (OuterVolumeSpecName: "kube-api-access-gzrnx") pod "705d39fb-c760-41ed-b5d6-ae4d8387631c" (UID: "705d39fb-c760-41ed-b5d6-ae4d8387631c"). InnerVolumeSpecName "kube-api-access-gzrnx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.057904 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "705d39fb-c760-41ed-b5d6-ae4d8387631c" (UID: "705d39fb-c760-41ed-b5d6-ae4d8387631c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.098609 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.098645 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/705d39fb-c760-41ed-b5d6-ae4d8387631c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.098660 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzrnx\" (UniqueName: \"kubernetes.io/projected/705d39fb-c760-41ed-b5d6-ae4d8387631c-kube-api-access-gzrnx\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.448891 4831 generic.go:334] "Generic (PLEG): container finished" podID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerID="f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d" exitCode=0 Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.448939 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p62s" event={"ID":"705d39fb-c760-41ed-b5d6-ae4d8387631c","Type":"ContainerDied","Data":"f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d"} Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.448963 4831 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/community-operators-8p62s" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.448978 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8p62s" event={"ID":"705d39fb-c760-41ed-b5d6-ae4d8387631c","Type":"ContainerDied","Data":"fa1bac761e086e6bc3f65a5d1052346476b4e0f0aa2adb4604476fc15fca7df3"} Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.449003 4831 scope.go:117] "RemoveContainer" containerID="f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.492409 4831 scope.go:117] "RemoveContainer" containerID="de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.495138 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8p62s"] Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.505823 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8p62s"] Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.616930 4831 scope.go:117] "RemoveContainer" containerID="38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.686029 4831 scope.go:117] "RemoveContainer" containerID="f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d" Dec 03 08:34:34 crc kubenswrapper[4831]: E1203 08:34:34.691263 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d\": container with ID starting with f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d not found: ID does not exist" containerID="f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.691341 
4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d"} err="failed to get container status \"f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d\": rpc error: code = NotFound desc = could not find container \"f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d\": container with ID starting with f41bd2f47a27a08dd1da0b542fa1613f76ab3374fb5699d9df29114b6297538d not found: ID does not exist" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.691370 4831 scope.go:117] "RemoveContainer" containerID="de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d" Dec 03 08:34:34 crc kubenswrapper[4831]: E1203 08:34:34.691626 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d\": container with ID starting with de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d not found: ID does not exist" containerID="de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.691668 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d"} err="failed to get container status \"de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d\": rpc error: code = NotFound desc = could not find container \"de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d\": container with ID starting with de314ddb5b51354979a6d3c0f1018863177019691a6dd78747b26ea7d4a5421d not found: ID does not exist" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.691682 4831 scope.go:117] "RemoveContainer" containerID="38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d" Dec 03 08:34:34 crc kubenswrapper[4831]: E1203 
08:34:34.691853 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d\": container with ID starting with 38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d not found: ID does not exist" containerID="38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d" Dec 03 08:34:34 crc kubenswrapper[4831]: I1203 08:34:34.691875 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d"} err="failed to get container status \"38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d\": rpc error: code = NotFound desc = could not find container \"38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d\": container with ID starting with 38010149b2c3d21a59251a286524fdb147009ac35427909e625378c3203e479d not found: ID does not exist" Dec 03 08:34:35 crc kubenswrapper[4831]: I1203 08:34:35.037509 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" path="/var/lib/kubelet/pods/705d39fb-c760-41ed-b5d6-ae4d8387631c/volumes" Dec 03 08:34:35 crc kubenswrapper[4831]: I1203 08:34:35.536077 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bk97c"] Dec 03 08:34:35 crc kubenswrapper[4831]: I1203 08:34:35.536399 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bk97c" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerName="registry-server" containerID="cri-o://1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63" gracePeriod=2 Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.098201 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.156031 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-utilities\") pod \"140e7ca4-e997-4f24-a97c-5393b46dcd38\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.156398 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-catalog-content\") pod \"140e7ca4-e997-4f24-a97c-5393b46dcd38\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.156461 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd8vd\" (UniqueName: \"kubernetes.io/projected/140e7ca4-e997-4f24-a97c-5393b46dcd38-kube-api-access-fd8vd\") pod \"140e7ca4-e997-4f24-a97c-5393b46dcd38\" (UID: \"140e7ca4-e997-4f24-a97c-5393b46dcd38\") " Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.157408 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-utilities" (OuterVolumeSpecName: "utilities") pod "140e7ca4-e997-4f24-a97c-5393b46dcd38" (UID: "140e7ca4-e997-4f24-a97c-5393b46dcd38"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.163760 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/140e7ca4-e997-4f24-a97c-5393b46dcd38-kube-api-access-fd8vd" (OuterVolumeSpecName: "kube-api-access-fd8vd") pod "140e7ca4-e997-4f24-a97c-5393b46dcd38" (UID: "140e7ca4-e997-4f24-a97c-5393b46dcd38"). InnerVolumeSpecName "kube-api-access-fd8vd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.210827 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "140e7ca4-e997-4f24-a97c-5393b46dcd38" (UID: "140e7ca4-e997-4f24-a97c-5393b46dcd38"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.259113 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.259146 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd8vd\" (UniqueName: \"kubernetes.io/projected/140e7ca4-e997-4f24-a97c-5393b46dcd38-kube-api-access-fd8vd\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.259158 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/140e7ca4-e997-4f24-a97c-5393b46dcd38-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.474773 4831 generic.go:334] "Generic (PLEG): container finished" podID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerID="1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63" exitCode=0 Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.474817 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bk97c" event={"ID":"140e7ca4-e997-4f24-a97c-5393b46dcd38","Type":"ContainerDied","Data":"1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63"} Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.475094 4831 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-bk97c" event={"ID":"140e7ca4-e997-4f24-a97c-5393b46dcd38","Type":"ContainerDied","Data":"d4757e6eb485dcf1f938b26fd5f5184c52ae7bc5aa97f8d04554c0822f8bfd01"} Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.475119 4831 scope.go:117] "RemoveContainer" containerID="1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.474839 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bk97c" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.501834 4831 scope.go:117] "RemoveContainer" containerID="86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.514640 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bk97c"] Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.523300 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bk97c"] Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.539475 4831 scope.go:117] "RemoveContainer" containerID="3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.586604 4831 scope.go:117] "RemoveContainer" containerID="1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63" Dec 03 08:34:36 crc kubenswrapper[4831]: E1203 08:34:36.587141 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63\": container with ID starting with 1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63 not found: ID does not exist" containerID="1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 
08:34:36.587195 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63"} err="failed to get container status \"1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63\": rpc error: code = NotFound desc = could not find container \"1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63\": container with ID starting with 1d455036f2594799e3a0697ca5faac0e15df46641813a8b9bf8185cc5017ae63 not found: ID does not exist" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.587229 4831 scope.go:117] "RemoveContainer" containerID="86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2" Dec 03 08:34:36 crc kubenswrapper[4831]: E1203 08:34:36.587621 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2\": container with ID starting with 86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2 not found: ID does not exist" containerID="86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.587653 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2"} err="failed to get container status \"86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2\": rpc error: code = NotFound desc = could not find container \"86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2\": container with ID starting with 86fda1823ecfe3a523854f675f38bc9649e5c566cc94ebe23b14279f712db0c2 not found: ID does not exist" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.587693 4831 scope.go:117] "RemoveContainer" containerID="3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825" Dec 03 08:34:36 crc 
kubenswrapper[4831]: E1203 08:34:36.587965 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825\": container with ID starting with 3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825 not found: ID does not exist" containerID="3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825" Dec 03 08:34:36 crc kubenswrapper[4831]: I1203 08:34:36.588048 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825"} err="failed to get container status \"3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825\": rpc error: code = NotFound desc = could not find container \"3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825\": container with ID starting with 3f9c9419671d7af0d01f6910e754b776d290b08a28c11abd50acf9eb5c87a825 not found: ID does not exist" Dec 03 08:34:37 crc kubenswrapper[4831]: I1203 08:34:37.034275 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" path="/var/lib/kubelet/pods/140e7ca4-e997-4f24-a97c-5393b46dcd38/volumes" Dec 03 08:34:40 crc kubenswrapper[4831]: I1203 08:34:40.013229 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:34:40 crc kubenswrapper[4831]: I1203 08:34:40.541088 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"da347abb806229f967733bacbfad1365f4ae7089c1b4bf1476bfd8bf32654286"} Dec 03 08:34:46 crc kubenswrapper[4831]: I1203 08:34:46.603033 4831 generic.go:334] "Generic (PLEG): container finished" podID="13bbbc69-d429-4189-a64f-070d16440ed4" 
containerID="e740800a08710b77ad0f51137e8e04ffb9c10981772d80114a5ceb68494fbaff" exitCode=0 Dec 03 08:34:46 crc kubenswrapper[4831]: I1203 08:34:46.603162 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" event={"ID":"13bbbc69-d429-4189-a64f-070d16440ed4","Type":"ContainerDied","Data":"e740800a08710b77ad0f51137e8e04ffb9c10981772d80114a5ceb68494fbaff"} Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.123479 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.324834 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc2sr\" (UniqueName: \"kubernetes.io/projected/13bbbc69-d429-4189-a64f-070d16440ed4-kube-api-access-mc2sr\") pod \"13bbbc69-d429-4189-a64f-070d16440ed4\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.325175 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ceph\") pod \"13bbbc69-d429-4189-a64f-070d16440ed4\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.325368 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ssh-key\") pod \"13bbbc69-d429-4189-a64f-070d16440ed4\" (UID: \"13bbbc69-d429-4189-a64f-070d16440ed4\") " Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.325528 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-inventory\") pod \"13bbbc69-d429-4189-a64f-070d16440ed4\" (UID: 
\"13bbbc69-d429-4189-a64f-070d16440ed4\") " Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.333980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ceph" (OuterVolumeSpecName: "ceph") pod "13bbbc69-d429-4189-a64f-070d16440ed4" (UID: "13bbbc69-d429-4189-a64f-070d16440ed4"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.334905 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13bbbc69-d429-4189-a64f-070d16440ed4-kube-api-access-mc2sr" (OuterVolumeSpecName: "kube-api-access-mc2sr") pod "13bbbc69-d429-4189-a64f-070d16440ed4" (UID: "13bbbc69-d429-4189-a64f-070d16440ed4"). InnerVolumeSpecName "kube-api-access-mc2sr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.370970 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "13bbbc69-d429-4189-a64f-070d16440ed4" (UID: "13bbbc69-d429-4189-a64f-070d16440ed4"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.371029 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-inventory" (OuterVolumeSpecName: "inventory") pod "13bbbc69-d429-4189-a64f-070d16440ed4" (UID: "13bbbc69-d429-4189-a64f-070d16440ed4"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.428783 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.428821 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.428836 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc2sr\" (UniqueName: \"kubernetes.io/projected/13bbbc69-d429-4189-a64f-070d16440ed4-kube-api-access-mc2sr\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.428851 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/13bbbc69-d429-4189-a64f-070d16440ed4-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.630228 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" event={"ID":"13bbbc69-d429-4189-a64f-070d16440ed4","Type":"ContainerDied","Data":"c00d81eb07e2d8c89dabcd3aa0ace8bc07f0c0d4476a19120fe80db055b3421a"} Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.630271 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c00d81eb07e2d8c89dabcd3aa0ace8bc07f0c0d4476a19120fe80db055b3421a" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.630360 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-openstack-openstack-cell1-nzmhv" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.735490 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-gtncm"] Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.735998 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerName="extract-content" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736017 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerName="extract-content" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736042 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerName="extract-utilities" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736052 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerName="extract-utilities" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736071 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerName="extract-utilities" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736079 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerName="extract-utilities" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736092 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736100 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736110 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736117 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736129 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerName="extract-content" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736137 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerName="extract-content" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736153 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerName="extract-utilities" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736160 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerName="extract-utilities" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736175 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736182 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736199 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerName="extract-content" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736205 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerName="extract-content" Dec 03 08:34:48 crc kubenswrapper[4831]: E1203 08:34:48.736218 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="13bbbc69-d429-4189-a64f-070d16440ed4" containerName="download-cache-openstack-openstack-cell1" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736227 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="13bbbc69-d429-4189-a64f-070d16440ed4" containerName="download-cache-openstack-openstack-cell1" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736478 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f4c82ff-cac7-4b5f-804b-465bc5540971" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736494 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="705d39fb-c760-41ed-b5d6-ae4d8387631c" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736508 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="13bbbc69-d429-4189-a64f-070d16440ed4" containerName="download-cache-openstack-openstack-cell1" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.736522 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="140e7ca4-e997-4f24-a97c-5393b46dcd38" containerName="registry-server" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.737836 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.740903 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.741230 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.741819 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.741996 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.752896 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-gtncm"] Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.839190 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ceph\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.839668 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6fxq\" (UniqueName: \"kubernetes.io/projected/6843897c-b341-4bcc-9a38-c9c9707022e8-kube-api-access-v6fxq\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.839706 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ssh-key\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.839751 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-inventory\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.941690 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ceph\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.943097 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6fxq\" (UniqueName: \"kubernetes.io/projected/6843897c-b341-4bcc-9a38-c9c9707022e8-kube-api-access-v6fxq\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.943361 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ssh-key\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " 
pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.943560 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-inventory\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.948419 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ssh-key\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.948833 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ceph\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.950474 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-inventory\") pod \"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:48 crc kubenswrapper[4831]: I1203 08:34:48.972737 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6fxq\" (UniqueName: \"kubernetes.io/projected/6843897c-b341-4bcc-9a38-c9c9707022e8-kube-api-access-v6fxq\") pod 
\"configure-network-openstack-openstack-cell1-gtncm\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:49 crc kubenswrapper[4831]: I1203 08:34:49.060220 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:34:49 crc kubenswrapper[4831]: I1203 08:34:49.643573 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-openstack-openstack-cell1-gtncm"] Dec 03 08:34:50 crc kubenswrapper[4831]: I1203 08:34:50.655456 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-gtncm" event={"ID":"6843897c-b341-4bcc-9a38-c9c9707022e8","Type":"ContainerStarted","Data":"ec56a50ebf4df8d367065065af64a8225a6a4063bf6594b19e6c932915c34188"} Dec 03 08:34:50 crc kubenswrapper[4831]: I1203 08:34:50.656020 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-gtncm" event={"ID":"6843897c-b341-4bcc-9a38-c9c9707022e8","Type":"ContainerStarted","Data":"3ca759de5521688d321e2adacb34367a1cf46847a93c27dd7615227c05529e80"} Dec 03 08:36:12 crc kubenswrapper[4831]: I1203 08:36:12.552086 4831 generic.go:334] "Generic (PLEG): container finished" podID="6843897c-b341-4bcc-9a38-c9c9707022e8" containerID="ec56a50ebf4df8d367065065af64a8225a6a4063bf6594b19e6c932915c34188" exitCode=0 Dec 03 08:36:12 crc kubenswrapper[4831]: I1203 08:36:12.552198 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-gtncm" event={"ID":"6843897c-b341-4bcc-9a38-c9c9707022e8","Type":"ContainerDied","Data":"ec56a50ebf4df8d367065065af64a8225a6a4063bf6594b19e6c932915c34188"} Dec 03 08:36:13 crc kubenswrapper[4831]: I1203 08:36:13.995520 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.009518 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-inventory\") pod \"6843897c-b341-4bcc-9a38-c9c9707022e8\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.009766 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6fxq\" (UniqueName: \"kubernetes.io/projected/6843897c-b341-4bcc-9a38-c9c9707022e8-kube-api-access-v6fxq\") pod \"6843897c-b341-4bcc-9a38-c9c9707022e8\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.009817 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ssh-key\") pod \"6843897c-b341-4bcc-9a38-c9c9707022e8\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.009890 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ceph\") pod \"6843897c-b341-4bcc-9a38-c9c9707022e8\" (UID: \"6843897c-b341-4bcc-9a38-c9c9707022e8\") " Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.016298 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ceph" (OuterVolumeSpecName: "ceph") pod "6843897c-b341-4bcc-9a38-c9c9707022e8" (UID: "6843897c-b341-4bcc-9a38-c9c9707022e8"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.032243 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6843897c-b341-4bcc-9a38-c9c9707022e8-kube-api-access-v6fxq" (OuterVolumeSpecName: "kube-api-access-v6fxq") pod "6843897c-b341-4bcc-9a38-c9c9707022e8" (UID: "6843897c-b341-4bcc-9a38-c9c9707022e8"). InnerVolumeSpecName "kube-api-access-v6fxq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.052455 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-inventory" (OuterVolumeSpecName: "inventory") pod "6843897c-b341-4bcc-9a38-c9c9707022e8" (UID: "6843897c-b341-4bcc-9a38-c9c9707022e8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.068124 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "6843897c-b341-4bcc-9a38-c9c9707022e8" (UID: "6843897c-b341-4bcc-9a38-c9c9707022e8"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.116066 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.116103 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v6fxq\" (UniqueName: \"kubernetes.io/projected/6843897c-b341-4bcc-9a38-c9c9707022e8-kube-api-access-v6fxq\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.116114 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.116123 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/6843897c-b341-4bcc-9a38-c9c9707022e8-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.572078 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-openstack-openstack-cell1-gtncm" event={"ID":"6843897c-b341-4bcc-9a38-c9c9707022e8","Type":"ContainerDied","Data":"3ca759de5521688d321e2adacb34367a1cf46847a93c27dd7615227c05529e80"} Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.572386 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3ca759de5521688d321e2adacb34367a1cf46847a93c27dd7615227c05529e80" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.572353 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-openstack-openstack-cell1-gtncm" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.685457 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-mrmbq"] Dec 03 08:36:14 crc kubenswrapper[4831]: E1203 08:36:14.685976 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6843897c-b341-4bcc-9a38-c9c9707022e8" containerName="configure-network-openstack-openstack-cell1" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.686000 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="6843897c-b341-4bcc-9a38-c9c9707022e8" containerName="configure-network-openstack-openstack-cell1" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.686341 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="6843897c-b341-4bcc-9a38-c9c9707022e8" containerName="configure-network-openstack-openstack-cell1" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.687268 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.689422 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.689610 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.689606 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.689722 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.695349 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-mrmbq"] Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.727613 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wchmv\" (UniqueName: \"kubernetes.io/projected/4526405f-7aac-4d79-ad67-42f6a2b1f241-kube-api-access-wchmv\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.727667 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ssh-key\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.727754 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-inventory\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.727806 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ceph\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.830614 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wchmv\" (UniqueName: \"kubernetes.io/projected/4526405f-7aac-4d79-ad67-42f6a2b1f241-kube-api-access-wchmv\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.830691 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ssh-key\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.830754 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-inventory\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " 
pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.830798 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ceph\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.835999 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ssh-key\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.836246 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ceph\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.840462 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-inventory\") pod \"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:14 crc kubenswrapper[4831]: I1203 08:36:14.853371 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wchmv\" (UniqueName: \"kubernetes.io/projected/4526405f-7aac-4d79-ad67-42f6a2b1f241-kube-api-access-wchmv\") pod 
\"validate-network-openstack-openstack-cell1-mrmbq\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:15 crc kubenswrapper[4831]: I1203 08:36:15.004028 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:15 crc kubenswrapper[4831]: W1203 08:36:15.588160 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4526405f_7aac_4d79_ad67_42f6a2b1f241.slice/crio-c6c5e64394bce50cf43f04dd66bfbac94adcaaae19d665bcaaa57d5466518d04 WatchSource:0}: Error finding container c6c5e64394bce50cf43f04dd66bfbac94adcaaae19d665bcaaa57d5466518d04: Status 404 returned error can't find the container with id c6c5e64394bce50cf43f04dd66bfbac94adcaaae19d665bcaaa57d5466518d04 Dec 03 08:36:15 crc kubenswrapper[4831]: I1203 08:36:15.590612 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:36:15 crc kubenswrapper[4831]: I1203 08:36:15.607208 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-openstack-openstack-cell1-mrmbq"] Dec 03 08:36:16 crc kubenswrapper[4831]: I1203 08:36:16.603276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" event={"ID":"4526405f-7aac-4d79-ad67-42f6a2b1f241","Type":"ContainerStarted","Data":"019151f26c215662fc0fef196be01f82bc8693ea0725cb7cb1db513305a8de45"} Dec 03 08:36:16 crc kubenswrapper[4831]: I1203 08:36:16.603801 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" event={"ID":"4526405f-7aac-4d79-ad67-42f6a2b1f241","Type":"ContainerStarted","Data":"c6c5e64394bce50cf43f04dd66bfbac94adcaaae19d665bcaaa57d5466518d04"} Dec 03 08:36:21 crc kubenswrapper[4831]: I1203 08:36:21.656986 4831 
generic.go:334] "Generic (PLEG): container finished" podID="4526405f-7aac-4d79-ad67-42f6a2b1f241" containerID="019151f26c215662fc0fef196be01f82bc8693ea0725cb7cb1db513305a8de45" exitCode=0 Dec 03 08:36:21 crc kubenswrapper[4831]: I1203 08:36:21.657065 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" event={"ID":"4526405f-7aac-4d79-ad67-42f6a2b1f241","Type":"ContainerDied","Data":"019151f26c215662fc0fef196be01f82bc8693ea0725cb7cb1db513305a8de45"} Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.245113 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.410253 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-inventory\") pod \"4526405f-7aac-4d79-ad67-42f6a2b1f241\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.410608 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ceph\") pod \"4526405f-7aac-4d79-ad67-42f6a2b1f241\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.410693 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ssh-key\") pod \"4526405f-7aac-4d79-ad67-42f6a2b1f241\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.410871 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wchmv\" (UniqueName: 
\"kubernetes.io/projected/4526405f-7aac-4d79-ad67-42f6a2b1f241-kube-api-access-wchmv\") pod \"4526405f-7aac-4d79-ad67-42f6a2b1f241\" (UID: \"4526405f-7aac-4d79-ad67-42f6a2b1f241\") " Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.415600 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4526405f-7aac-4d79-ad67-42f6a2b1f241-kube-api-access-wchmv" (OuterVolumeSpecName: "kube-api-access-wchmv") pod "4526405f-7aac-4d79-ad67-42f6a2b1f241" (UID: "4526405f-7aac-4d79-ad67-42f6a2b1f241"). InnerVolumeSpecName "kube-api-access-wchmv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.423241 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ceph" (OuterVolumeSpecName: "ceph") pod "4526405f-7aac-4d79-ad67-42f6a2b1f241" (UID: "4526405f-7aac-4d79-ad67-42f6a2b1f241"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.443938 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-inventory" (OuterVolumeSpecName: "inventory") pod "4526405f-7aac-4d79-ad67-42f6a2b1f241" (UID: "4526405f-7aac-4d79-ad67-42f6a2b1f241"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.453280 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4526405f-7aac-4d79-ad67-42f6a2b1f241" (UID: "4526405f-7aac-4d79-ad67-42f6a2b1f241"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.512928 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.513135 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wchmv\" (UniqueName: \"kubernetes.io/projected/4526405f-7aac-4d79-ad67-42f6a2b1f241-kube-api-access-wchmv\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.513253 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.513390 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4526405f-7aac-4d79-ad67-42f6a2b1f241-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.675678 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" event={"ID":"4526405f-7aac-4d79-ad67-42f6a2b1f241","Type":"ContainerDied","Data":"c6c5e64394bce50cf43f04dd66bfbac94adcaaae19d665bcaaa57d5466518d04"} Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.675719 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6c5e64394bce50cf43f04dd66bfbac94adcaaae19d665bcaaa57d5466518d04" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.675723 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-openstack-openstack-cell1-mrmbq" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.756263 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-openstack-openstack-cell1-cjwd2"] Dec 03 08:36:23 crc kubenswrapper[4831]: E1203 08:36:23.757779 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4526405f-7aac-4d79-ad67-42f6a2b1f241" containerName="validate-network-openstack-openstack-cell1" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.757819 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4526405f-7aac-4d79-ad67-42f6a2b1f241" containerName="validate-network-openstack-openstack-cell1" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.758175 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4526405f-7aac-4d79-ad67-42f6a2b1f241" containerName="validate-network-openstack-openstack-cell1" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.759221 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.770028 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-cjwd2"] Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.771253 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.771719 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.771947 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.772178 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.923906 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ceph\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.923992 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-inventory\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.924092 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ssh-key\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:23 crc kubenswrapper[4831]: I1203 08:36:23.924257 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k7kq\" (UniqueName: \"kubernetes.io/projected/4287499e-d78b-43b2-b353-18c288c585a4-kube-api-access-4k7kq\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.025870 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ceph\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.025988 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-inventory\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.026098 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ssh-key\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.026239 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-4k7kq\" (UniqueName: \"kubernetes.io/projected/4287499e-d78b-43b2-b353-18c288c585a4-kube-api-access-4k7kq\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.031414 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ceph\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.032026 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-inventory\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.032972 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ssh-key\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.065349 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k7kq\" (UniqueName: \"kubernetes.io/projected/4287499e-d78b-43b2-b353-18c288c585a4-kube-api-access-4k7kq\") pod \"install-os-openstack-openstack-cell1-cjwd2\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.091228 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:36:24 crc kubenswrapper[4831]: I1203 08:36:24.693431 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-openstack-openstack-cell1-cjwd2"] Dec 03 08:36:25 crc kubenswrapper[4831]: I1203 08:36:25.718488 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" event={"ID":"4287499e-d78b-43b2-b353-18c288c585a4","Type":"ContainerStarted","Data":"d1611f7fd6d173703af1779c387d1f0a3b61237c7169019acb33e91ee282e8e5"} Dec 03 08:36:25 crc kubenswrapper[4831]: I1203 08:36:25.719035 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" event={"ID":"4287499e-d78b-43b2-b353-18c288c585a4","Type":"ContainerStarted","Data":"de37a90f3030a8917baa1e922fae48866c7dd80713c161f2747a084889b19628"} Dec 03 08:36:25 crc kubenswrapper[4831]: I1203 08:36:25.744776 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" podStartSLOduration=2.299364095 podStartE2EDuration="2.744754945s" podCreationTimestamp="2025-12-03 08:36:23 +0000 UTC" firstStartedPulling="2025-12-03 08:36:24.696278983 +0000 UTC m=+7522.039862491" lastFinishedPulling="2025-12-03 08:36:25.141669823 +0000 UTC m=+7522.485253341" observedRunningTime="2025-12-03 08:36:25.741270436 +0000 UTC m=+7523.084853944" watchObservedRunningTime="2025-12-03 08:36:25.744754945 +0000 UTC m=+7523.088338453" Dec 03 08:36:57 crc kubenswrapper[4831]: I1203 08:36:57.596383 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:36:57 crc 
kubenswrapper[4831]: I1203 08:36:57.596867 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:37:12 crc kubenswrapper[4831]: I1203 08:37:12.221051 4831 generic.go:334] "Generic (PLEG): container finished" podID="4287499e-d78b-43b2-b353-18c288c585a4" containerID="d1611f7fd6d173703af1779c387d1f0a3b61237c7169019acb33e91ee282e8e5" exitCode=0 Dec 03 08:37:12 crc kubenswrapper[4831]: I1203 08:37:12.221164 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" event={"ID":"4287499e-d78b-43b2-b353-18c288c585a4","Type":"ContainerDied","Data":"d1611f7fd6d173703af1779c387d1f0a3b61237c7169019acb33e91ee282e8e5"} Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.752715 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.780989 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-inventory\") pod \"4287499e-d78b-43b2-b353-18c288c585a4\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.781141 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k7kq\" (UniqueName: \"kubernetes.io/projected/4287499e-d78b-43b2-b353-18c288c585a4-kube-api-access-4k7kq\") pod \"4287499e-d78b-43b2-b353-18c288c585a4\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.781229 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ssh-key\") pod \"4287499e-d78b-43b2-b353-18c288c585a4\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.781311 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ceph\") pod \"4287499e-d78b-43b2-b353-18c288c585a4\" (UID: \"4287499e-d78b-43b2-b353-18c288c585a4\") " Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.791461 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ceph" (OuterVolumeSpecName: "ceph") pod "4287499e-d78b-43b2-b353-18c288c585a4" (UID: "4287499e-d78b-43b2-b353-18c288c585a4"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.802492 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4287499e-d78b-43b2-b353-18c288c585a4-kube-api-access-4k7kq" (OuterVolumeSpecName: "kube-api-access-4k7kq") pod "4287499e-d78b-43b2-b353-18c288c585a4" (UID: "4287499e-d78b-43b2-b353-18c288c585a4"). InnerVolumeSpecName "kube-api-access-4k7kq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.812095 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-inventory" (OuterVolumeSpecName: "inventory") pod "4287499e-d78b-43b2-b353-18c288c585a4" (UID: "4287499e-d78b-43b2-b353-18c288c585a4"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.815825 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "4287499e-d78b-43b2-b353-18c288c585a4" (UID: "4287499e-d78b-43b2-b353-18c288c585a4"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.883946 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.883983 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4k7kq\" (UniqueName: \"kubernetes.io/projected/4287499e-d78b-43b2-b353-18c288c585a4-kube-api-access-4k7kq\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.883994 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:13 crc kubenswrapper[4831]: I1203 08:37:13.884004 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/4287499e-d78b-43b2-b353-18c288c585a4-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.295310 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" event={"ID":"4287499e-d78b-43b2-b353-18c288c585a4","Type":"ContainerDied","Data":"de37a90f3030a8917baa1e922fae48866c7dd80713c161f2747a084889b19628"} Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.295398 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de37a90f3030a8917baa1e922fae48866c7dd80713c161f2747a084889b19628" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.295419 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-openstack-openstack-cell1-cjwd2" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.405682 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wgszl"] Dec 03 08:37:14 crc kubenswrapper[4831]: E1203 08:37:14.406295 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4287499e-d78b-43b2-b353-18c288c585a4" containerName="install-os-openstack-openstack-cell1" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.406342 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4287499e-d78b-43b2-b353-18c288c585a4" containerName="install-os-openstack-openstack-cell1" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.406684 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4287499e-d78b-43b2-b353-18c288c585a4" containerName="install-os-openstack-openstack-cell1" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.407716 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.409886 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.410741 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.410829 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.411458 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.427849 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wgszl"] Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.497980 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ceph\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.498227 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvn8k\" (UniqueName: \"kubernetes.io/projected/61e7c5dd-7643-4f95-a7d0-acff11d86694-kube-api-access-pvn8k\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.498675 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ssh-key\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.498815 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-inventory\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.601082 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ceph\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.601188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvn8k\" (UniqueName: \"kubernetes.io/projected/61e7c5dd-7643-4f95-a7d0-acff11d86694-kube-api-access-pvn8k\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.601281 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ssh-key\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.601339 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-inventory\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.608821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-inventory\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.612080 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ceph\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.622122 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ssh-key\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 08:37:14.626034 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvn8k\" (UniqueName: \"kubernetes.io/projected/61e7c5dd-7643-4f95-a7d0-acff11d86694-kube-api-access-pvn8k\") pod \"configure-os-openstack-openstack-cell1-wgszl\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:14 crc kubenswrapper[4831]: I1203 
08:37:14.738858 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:37:15 crc kubenswrapper[4831]: I1203 08:37:15.409472 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-openstack-openstack-cell1-wgszl"] Dec 03 08:37:16 crc kubenswrapper[4831]: I1203 08:37:16.337698 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" event={"ID":"61e7c5dd-7643-4f95-a7d0-acff11d86694","Type":"ContainerStarted","Data":"c400ee9f0b8d07a746c9c1b017d5fcb133ee7d6d9268307496f759d75d52ddf7"} Dec 03 08:37:16 crc kubenswrapper[4831]: I1203 08:37:16.338407 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" event={"ID":"61e7c5dd-7643-4f95-a7d0-acff11d86694","Type":"ContainerStarted","Data":"822a46bee0a4350c42c575f8d243d506f3c24f820612887ba5fdf411e5ffc998"} Dec 03 08:37:16 crc kubenswrapper[4831]: I1203 08:37:16.365766 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" podStartSLOduration=2.185352363 podStartE2EDuration="2.365740731s" podCreationTimestamp="2025-12-03 08:37:14 +0000 UTC" firstStartedPulling="2025-12-03 08:37:15.407678026 +0000 UTC m=+7572.751261544" lastFinishedPulling="2025-12-03 08:37:15.588066404 +0000 UTC m=+7572.931649912" observedRunningTime="2025-12-03 08:37:16.361041115 +0000 UTC m=+7573.704624633" watchObservedRunningTime="2025-12-03 08:37:16.365740731 +0000 UTC m=+7573.709324249" Dec 03 08:37:27 crc kubenswrapper[4831]: I1203 08:37:27.596711 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 
08:37:27 crc kubenswrapper[4831]: I1203 08:37:27.597248 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:37:57 crc kubenswrapper[4831]: I1203 08:37:57.596976 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:37:57 crc kubenswrapper[4831]: I1203 08:37:57.597575 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:37:57 crc kubenswrapper[4831]: I1203 08:37:57.597639 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:37:57 crc kubenswrapper[4831]: I1203 08:37:57.598306 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"da347abb806229f967733bacbfad1365f4ae7089c1b4bf1476bfd8bf32654286"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:37:57 crc kubenswrapper[4831]: I1203 08:37:57.598398 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://da347abb806229f967733bacbfad1365f4ae7089c1b4bf1476bfd8bf32654286" gracePeriod=600 Dec 03 08:37:57 crc kubenswrapper[4831]: I1203 08:37:57.822153 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="da347abb806229f967733bacbfad1365f4ae7089c1b4bf1476bfd8bf32654286" exitCode=0 Dec 03 08:37:57 crc kubenswrapper[4831]: I1203 08:37:57.822214 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"da347abb806229f967733bacbfad1365f4ae7089c1b4bf1476bfd8bf32654286"} Dec 03 08:37:57 crc kubenswrapper[4831]: I1203 08:37:57.822550 4831 scope.go:117] "RemoveContainer" containerID="6c971825398635e2134221f56926fa414b073e1b931d9e21ec56220faf3eafa4" Dec 03 08:37:58 crc kubenswrapper[4831]: I1203 08:37:58.833896 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5"} Dec 03 08:38:02 crc kubenswrapper[4831]: I1203 08:38:02.883138 4831 generic.go:334] "Generic (PLEG): container finished" podID="61e7c5dd-7643-4f95-a7d0-acff11d86694" containerID="c400ee9f0b8d07a746c9c1b017d5fcb133ee7d6d9268307496f759d75d52ddf7" exitCode=0 Dec 03 08:38:02 crc kubenswrapper[4831]: I1203 08:38:02.885164 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" event={"ID":"61e7c5dd-7643-4f95-a7d0-acff11d86694","Type":"ContainerDied","Data":"c400ee9f0b8d07a746c9c1b017d5fcb133ee7d6d9268307496f759d75d52ddf7"} Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.422831 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.551277 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ssh-key\") pod \"61e7c5dd-7643-4f95-a7d0-acff11d86694\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.551729 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ceph\") pod \"61e7c5dd-7643-4f95-a7d0-acff11d86694\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.551935 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-inventory\") pod \"61e7c5dd-7643-4f95-a7d0-acff11d86694\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.552042 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvn8k\" (UniqueName: \"kubernetes.io/projected/61e7c5dd-7643-4f95-a7d0-acff11d86694-kube-api-access-pvn8k\") pod \"61e7c5dd-7643-4f95-a7d0-acff11d86694\" (UID: \"61e7c5dd-7643-4f95-a7d0-acff11d86694\") " Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.558120 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61e7c5dd-7643-4f95-a7d0-acff11d86694-kube-api-access-pvn8k" (OuterVolumeSpecName: "kube-api-access-pvn8k") pod "61e7c5dd-7643-4f95-a7d0-acff11d86694" (UID: "61e7c5dd-7643-4f95-a7d0-acff11d86694"). InnerVolumeSpecName "kube-api-access-pvn8k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.575397 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ceph" (OuterVolumeSpecName: "ceph") pod "61e7c5dd-7643-4f95-a7d0-acff11d86694" (UID: "61e7c5dd-7643-4f95-a7d0-acff11d86694"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.579165 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-inventory" (OuterVolumeSpecName: "inventory") pod "61e7c5dd-7643-4f95-a7d0-acff11d86694" (UID: "61e7c5dd-7643-4f95-a7d0-acff11d86694"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.586781 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "61e7c5dd-7643-4f95-a7d0-acff11d86694" (UID: "61e7c5dd-7643-4f95-a7d0-acff11d86694"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.654404 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.654439 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.654452 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/61e7c5dd-7643-4f95-a7d0-acff11d86694-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.654467 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvn8k\" (UniqueName: \"kubernetes.io/projected/61e7c5dd-7643-4f95-a7d0-acff11d86694-kube-api-access-pvn8k\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.916380 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" event={"ID":"61e7c5dd-7643-4f95-a7d0-acff11d86694","Type":"ContainerDied","Data":"822a46bee0a4350c42c575f8d243d506f3c24f820612887ba5fdf411e5ffc998"} Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.916747 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="822a46bee0a4350c42c575f8d243d506f3c24f820612887ba5fdf411e5ffc998" Dec 03 08:38:04 crc kubenswrapper[4831]: I1203 08:38:04.916499 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-openstack-openstack-cell1-wgszl" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.029740 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-openstack-lfwtg"] Dec 03 08:38:05 crc kubenswrapper[4831]: E1203 08:38:05.030148 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61e7c5dd-7643-4f95-a7d0-acff11d86694" containerName="configure-os-openstack-openstack-cell1" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.030170 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="61e7c5dd-7643-4f95-a7d0-acff11d86694" containerName="configure-os-openstack-openstack-cell1" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.030491 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="61e7c5dd-7643-4f95-a7d0-acff11d86694" containerName="configure-os-openstack-openstack-cell1" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.031369 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-lfwtg"] Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.031467 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.033659 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.033669 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.034168 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.034284 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.165298 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.165479 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnrh9\" (UniqueName: \"kubernetes.io/projected/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-kube-api-access-mnrh9\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.165761 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-inventory-0\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " 
pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.165823 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ceph\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.267605 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnrh9\" (UniqueName: \"kubernetes.io/projected/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-kube-api-access-mnrh9\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.267702 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-inventory-0\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.267730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ceph\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.267888 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " 
pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.275288 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-inventory-0\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.273254 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ceph\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.282455 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ssh-key-openstack-cell1\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.304676 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnrh9\" (UniqueName: \"kubernetes.io/projected/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-kube-api-access-mnrh9\") pod \"ssh-known-hosts-openstack-lfwtg\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.375083 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:05 crc kubenswrapper[4831]: I1203 08:38:05.997703 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-openstack-lfwtg"] Dec 03 08:38:06 crc kubenswrapper[4831]: I1203 08:38:06.940990 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-lfwtg" event={"ID":"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935","Type":"ContainerStarted","Data":"f45b2e1644d628320a1885095de6ecaeabaedc708912102871d2451aa0386bc9"} Dec 03 08:38:06 crc kubenswrapper[4831]: I1203 08:38:06.941525 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-lfwtg" event={"ID":"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935","Type":"ContainerStarted","Data":"954f09c6082d820c1222a15d714117a5aaea7e724e20adca51f0a342854f6750"} Dec 03 08:38:06 crc kubenswrapper[4831]: I1203 08:38:06.965198 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-openstack-lfwtg" podStartSLOduration=2.829585561 podStartE2EDuration="2.965157692s" podCreationTimestamp="2025-12-03 08:38:04 +0000 UTC" firstStartedPulling="2025-12-03 08:38:06.008343216 +0000 UTC m=+7623.351926744" lastFinishedPulling="2025-12-03 08:38:06.143915367 +0000 UTC m=+7623.487498875" observedRunningTime="2025-12-03 08:38:06.957044389 +0000 UTC m=+7624.300627917" watchObservedRunningTime="2025-12-03 08:38:06.965157692 +0000 UTC m=+7624.308741200" Dec 03 08:38:16 crc kubenswrapper[4831]: I1203 08:38:16.056529 4831 generic.go:334] "Generic (PLEG): container finished" podID="d34dc8ad-d601-4df6-8bb8-4dc8d76e3935" containerID="f45b2e1644d628320a1885095de6ecaeabaedc708912102871d2451aa0386bc9" exitCode=0 Dec 03 08:38:16 crc kubenswrapper[4831]: I1203 08:38:16.056651 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-lfwtg" 
event={"ID":"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935","Type":"ContainerDied","Data":"f45b2e1644d628320a1885095de6ecaeabaedc708912102871d2451aa0386bc9"} Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.466798 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.563634 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ceph\") pod \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.563689 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ssh-key-openstack-cell1\") pod \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.563712 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-inventory-0\") pod \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.563739 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrh9\" (UniqueName: \"kubernetes.io/projected/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-kube-api-access-mnrh9\") pod \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\" (UID: \"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935\") " Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.569780 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-kube-api-access-mnrh9" 
(OuterVolumeSpecName: "kube-api-access-mnrh9") pod "d34dc8ad-d601-4df6-8bb8-4dc8d76e3935" (UID: "d34dc8ad-d601-4df6-8bb8-4dc8d76e3935"). InnerVolumeSpecName "kube-api-access-mnrh9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.570886 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ceph" (OuterVolumeSpecName: "ceph") pod "d34dc8ad-d601-4df6-8bb8-4dc8d76e3935" (UID: "d34dc8ad-d601-4df6-8bb8-4dc8d76e3935"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.594596 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ssh-key-openstack-cell1" (OuterVolumeSpecName: "ssh-key-openstack-cell1") pod "d34dc8ad-d601-4df6-8bb8-4dc8d76e3935" (UID: "d34dc8ad-d601-4df6-8bb8-4dc8d76e3935"). InnerVolumeSpecName "ssh-key-openstack-cell1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.598417 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "d34dc8ad-d601-4df6-8bb8-4dc8d76e3935" (UID: "d34dc8ad-d601-4df6-8bb8-4dc8d76e3935"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.665727 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.666024 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-cell1\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-ssh-key-openstack-cell1\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.666035 4831 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-inventory-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:17 crc kubenswrapper[4831]: I1203 08:38:17.666045 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrh9\" (UniqueName: \"kubernetes.io/projected/d34dc8ad-d601-4df6-8bb8-4dc8d76e3935-kube-api-access-mnrh9\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.077197 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-openstack-lfwtg" event={"ID":"d34dc8ad-d601-4df6-8bb8-4dc8d76e3935","Type":"ContainerDied","Data":"954f09c6082d820c1222a15d714117a5aaea7e724e20adca51f0a342854f6750"} Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.077244 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="954f09c6082d820c1222a15d714117a5aaea7e724e20adca51f0a342854f6750" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.077272 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-openstack-lfwtg" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.170058 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fnq88"] Dec 03 08:38:18 crc kubenswrapper[4831]: E1203 08:38:18.170509 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d34dc8ad-d601-4df6-8bb8-4dc8d76e3935" containerName="ssh-known-hosts-openstack" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.170527 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="d34dc8ad-d601-4df6-8bb8-4dc8d76e3935" containerName="ssh-known-hosts-openstack" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.170811 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="d34dc8ad-d601-4df6-8bb8-4dc8d76e3935" containerName="ssh-known-hosts-openstack" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.174427 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.177112 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.177268 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.177512 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.177664 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.187485 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ceph\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.187818 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ssh-key\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.187857 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-inventory\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.187951 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fnq88"] Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.187990 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkt7r\" (UniqueName: \"kubernetes.io/projected/7eac7128-cee7-426e-83e3-7579b7744457-kube-api-access-nkt7r\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.289940 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ssh-key\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " 
pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.289988 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-inventory\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.290050 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkt7r\" (UniqueName: \"kubernetes.io/projected/7eac7128-cee7-426e-83e3-7579b7744457-kube-api-access-nkt7r\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.290093 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ceph\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.294614 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ceph\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.294601 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ssh-key\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " 
pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.294895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-inventory\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.313162 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkt7r\" (UniqueName: \"kubernetes.io/projected/7eac7128-cee7-426e-83e3-7579b7744457-kube-api-access-nkt7r\") pod \"run-os-openstack-openstack-cell1-fnq88\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:18 crc kubenswrapper[4831]: I1203 08:38:18.509288 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:19 crc kubenswrapper[4831]: I1203 08:38:19.072188 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-openstack-openstack-cell1-fnq88"] Dec 03 08:38:19 crc kubenswrapper[4831]: I1203 08:38:19.093868 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fnq88" event={"ID":"7eac7128-cee7-426e-83e3-7579b7744457","Type":"ContainerStarted","Data":"4cfc517d5c6aed726c6561c244c22670703f8bfd0a50ec3136b858e49e29c5be"} Dec 03 08:38:20 crc kubenswrapper[4831]: I1203 08:38:20.112433 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fnq88" event={"ID":"7eac7128-cee7-426e-83e3-7579b7744457","Type":"ContainerStarted","Data":"72328c5ca682b74324039f18ac7cf0799a6ca4b79ec8cef7dec9a0cecbad04e5"} Dec 03 08:38:20 crc kubenswrapper[4831]: I1203 08:38:20.142280 4831 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openstack/run-os-openstack-openstack-cell1-fnq88" podStartSLOduration=1.915256575 podStartE2EDuration="2.142250664s" podCreationTimestamp="2025-12-03 08:38:18 +0000 UTC" firstStartedPulling="2025-12-03 08:38:19.087177267 +0000 UTC m=+7636.430760775" lastFinishedPulling="2025-12-03 08:38:19.314171316 +0000 UTC m=+7636.657754864" observedRunningTime="2025-12-03 08:38:20.129721904 +0000 UTC m=+7637.473305412" watchObservedRunningTime="2025-12-03 08:38:20.142250664 +0000 UTC m=+7637.485834212" Dec 03 08:38:28 crc kubenswrapper[4831]: I1203 08:38:28.197873 4831 generic.go:334] "Generic (PLEG): container finished" podID="7eac7128-cee7-426e-83e3-7579b7744457" containerID="72328c5ca682b74324039f18ac7cf0799a6ca4b79ec8cef7dec9a0cecbad04e5" exitCode=0 Dec 03 08:38:28 crc kubenswrapper[4831]: I1203 08:38:28.198032 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fnq88" event={"ID":"7eac7128-cee7-426e-83e3-7579b7744457","Type":"ContainerDied","Data":"72328c5ca682b74324039f18ac7cf0799a6ca4b79ec8cef7dec9a0cecbad04e5"} Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.865077 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.951701 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ceph\") pod \"7eac7128-cee7-426e-83e3-7579b7744457\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.951920 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-inventory\") pod \"7eac7128-cee7-426e-83e3-7579b7744457\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.952017 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkt7r\" (UniqueName: \"kubernetes.io/projected/7eac7128-cee7-426e-83e3-7579b7744457-kube-api-access-nkt7r\") pod \"7eac7128-cee7-426e-83e3-7579b7744457\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.952037 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ssh-key\") pod \"7eac7128-cee7-426e-83e3-7579b7744457\" (UID: \"7eac7128-cee7-426e-83e3-7579b7744457\") " Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.960431 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ceph" (OuterVolumeSpecName: "ceph") pod "7eac7128-cee7-426e-83e3-7579b7744457" (UID: "7eac7128-cee7-426e-83e3-7579b7744457"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.964779 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7eac7128-cee7-426e-83e3-7579b7744457-kube-api-access-nkt7r" (OuterVolumeSpecName: "kube-api-access-nkt7r") pod "7eac7128-cee7-426e-83e3-7579b7744457" (UID: "7eac7128-cee7-426e-83e3-7579b7744457"). InnerVolumeSpecName "kube-api-access-nkt7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.997612 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "7eac7128-cee7-426e-83e3-7579b7744457" (UID: "7eac7128-cee7-426e-83e3-7579b7744457"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:29 crc kubenswrapper[4831]: I1203 08:38:29.997861 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-inventory" (OuterVolumeSpecName: "inventory") pod "7eac7128-cee7-426e-83e3-7579b7744457" (UID: "7eac7128-cee7-426e-83e3-7579b7744457"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.056184 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.056349 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkt7r\" (UniqueName: \"kubernetes.io/projected/7eac7128-cee7-426e-83e3-7579b7744457-kube-api-access-nkt7r\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.056384 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.056407 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/7eac7128-cee7-426e-83e3-7579b7744457-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.229195 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-openstack-openstack-cell1-fnq88" event={"ID":"7eac7128-cee7-426e-83e3-7579b7744457","Type":"ContainerDied","Data":"4cfc517d5c6aed726c6561c244c22670703f8bfd0a50ec3136b858e49e29c5be"} Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.229251 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cfc517d5c6aed726c6561c244c22670703f8bfd0a50ec3136b858e49e29c5be" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.229281 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-openstack-openstack-cell1-fnq88" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.316688 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-c684f"] Dec 03 08:38:30 crc kubenswrapper[4831]: E1203 08:38:30.317256 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7eac7128-cee7-426e-83e3-7579b7744457" containerName="run-os-openstack-openstack-cell1" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.317283 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="7eac7128-cee7-426e-83e3-7579b7744457" containerName="run-os-openstack-openstack-cell1" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.318783 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="7eac7128-cee7-426e-83e3-7579b7744457" containerName="run-os-openstack-openstack-cell1" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.319846 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.325848 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.326447 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.326616 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.326773 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.343299 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-c684f"] Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.360746 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-inventory\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.360786 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.360883 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ceph\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.360930 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxz2w\" (UniqueName: \"kubernetes.io/projected/bc45326f-d4eb-443f-babc-57f5bd7aa587-kube-api-access-mxz2w\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.461889 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-inventory\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.461934 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.462028 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ceph\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.462087 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-mxz2w\" (UniqueName: \"kubernetes.io/projected/bc45326f-d4eb-443f-babc-57f5bd7aa587-kube-api-access-mxz2w\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.466572 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ceph\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.467268 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-inventory\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.474780 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ssh-key\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.478753 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxz2w\" (UniqueName: \"kubernetes.io/projected/bc45326f-d4eb-443f-babc-57f5bd7aa587-kube-api-access-mxz2w\") pod \"reboot-os-openstack-openstack-cell1-c684f\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:30 crc kubenswrapper[4831]: I1203 08:38:30.659173 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:31 crc kubenswrapper[4831]: I1203 08:38:31.247949 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-openstack-openstack-cell1-c684f"] Dec 03 08:38:32 crc kubenswrapper[4831]: I1203 08:38:32.248202 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" event={"ID":"bc45326f-d4eb-443f-babc-57f5bd7aa587","Type":"ContainerStarted","Data":"8fa7ad006a3499e434b285cd55c09caba187903ab0fc565f47c51e797d2411e6"} Dec 03 08:38:32 crc kubenswrapper[4831]: I1203 08:38:32.248860 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" event={"ID":"bc45326f-d4eb-443f-babc-57f5bd7aa587","Type":"ContainerStarted","Data":"af90f29aea4c176f85afb958279b8aae3bfe3af2672e880966cdcb631002664a"} Dec 03 08:38:32 crc kubenswrapper[4831]: I1203 08:38:32.268205 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" podStartSLOduration=2.100747926 podStartE2EDuration="2.268187541s" podCreationTimestamp="2025-12-03 08:38:30 +0000 UTC" firstStartedPulling="2025-12-03 08:38:31.253420839 +0000 UTC m=+7648.597004347" lastFinishedPulling="2025-12-03 08:38:31.420860454 +0000 UTC m=+7648.764443962" observedRunningTime="2025-12-03 08:38:32.267880851 +0000 UTC m=+7649.611464359" watchObservedRunningTime="2025-12-03 08:38:32.268187541 +0000 UTC m=+7649.611771059" Dec 03 08:38:48 crc kubenswrapper[4831]: I1203 08:38:48.450163 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" event={"ID":"bc45326f-d4eb-443f-babc-57f5bd7aa587","Type":"ContainerDied","Data":"8fa7ad006a3499e434b285cd55c09caba187903ab0fc565f47c51e797d2411e6"} Dec 03 08:38:48 crc kubenswrapper[4831]: I1203 08:38:48.450069 4831 generic.go:334] "Generic (PLEG): container 
finished" podID="bc45326f-d4eb-443f-babc-57f5bd7aa587" containerID="8fa7ad006a3499e434b285cd55c09caba187903ab0fc565f47c51e797d2411e6" exitCode=0 Dec 03 08:38:49 crc kubenswrapper[4831]: I1203 08:38:49.914094 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.037915 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxz2w\" (UniqueName: \"kubernetes.io/projected/bc45326f-d4eb-443f-babc-57f5bd7aa587-kube-api-access-mxz2w\") pod \"bc45326f-d4eb-443f-babc-57f5bd7aa587\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.038071 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ssh-key\") pod \"bc45326f-d4eb-443f-babc-57f5bd7aa587\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.038225 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ceph\") pod \"bc45326f-d4eb-443f-babc-57f5bd7aa587\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.038271 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-inventory\") pod \"bc45326f-d4eb-443f-babc-57f5bd7aa587\" (UID: \"bc45326f-d4eb-443f-babc-57f5bd7aa587\") " Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.045153 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ceph" (OuterVolumeSpecName: "ceph") pod 
"bc45326f-d4eb-443f-babc-57f5bd7aa587" (UID: "bc45326f-d4eb-443f-babc-57f5bd7aa587"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.046595 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc45326f-d4eb-443f-babc-57f5bd7aa587-kube-api-access-mxz2w" (OuterVolumeSpecName: "kube-api-access-mxz2w") pod "bc45326f-d4eb-443f-babc-57f5bd7aa587" (UID: "bc45326f-d4eb-443f-babc-57f5bd7aa587"). InnerVolumeSpecName "kube-api-access-mxz2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.071573 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-inventory" (OuterVolumeSpecName: "inventory") pod "bc45326f-d4eb-443f-babc-57f5bd7aa587" (UID: "bc45326f-d4eb-443f-babc-57f5bd7aa587"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.072432 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bc45326f-d4eb-443f-babc-57f5bd7aa587" (UID: "bc45326f-d4eb-443f-babc-57f5bd7aa587"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.141834 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.141891 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.141910 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bc45326f-d4eb-443f-babc-57f5bd7aa587-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.141929 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mxz2w\" (UniqueName: \"kubernetes.io/projected/bc45326f-d4eb-443f-babc-57f5bd7aa587-kube-api-access-mxz2w\") on node \"crc\" DevicePath \"\"" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.472489 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" event={"ID":"bc45326f-d4eb-443f-babc-57f5bd7aa587","Type":"ContainerDied","Data":"af90f29aea4c176f85afb958279b8aae3bfe3af2672e880966cdcb631002664a"} Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.472899 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af90f29aea4c176f85afb958279b8aae3bfe3af2672e880966cdcb631002664a" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.472592 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-openstack-openstack-cell1-c684f" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.572489 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5jpj2"] Dec 03 08:38:50 crc kubenswrapper[4831]: E1203 08:38:50.573229 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc45326f-d4eb-443f-babc-57f5bd7aa587" containerName="reboot-os-openstack-openstack-cell1" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.573369 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc45326f-d4eb-443f-babc-57f5bd7aa587" containerName="reboot-os-openstack-openstack-cell1" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.573743 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc45326f-d4eb-443f-babc-57f5bd7aa587" containerName="reboot-os-openstack-openstack-cell1" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.574737 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.577302 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.577720 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.578122 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.578417 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.602254 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5jpj2"] Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.657884 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5ql6\" (UniqueName: \"kubernetes.io/projected/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-kube-api-access-q5ql6\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.657933 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ceph\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658027 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658166 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658260 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658282 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658392 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658410 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-inventory\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658433 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658526 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658657 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " 
pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.658734 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.760748 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.760826 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.760904 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.760938 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-inventory\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.760973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.761116 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.761224 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.761344 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " 
pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.761423 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5ql6\" (UniqueName: \"kubernetes.io/projected/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-kube-api-access-q5ql6\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.761459 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ceph\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.761542 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.761611 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.766623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-sriov-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.767156 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-inventory\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.767171 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-metadata-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.768084 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ovn-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.768398 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ceph\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 
08:38:50.768734 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-telemetry-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.770181 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-bootstrap-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.771190 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-nova-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.771562 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-libvirt-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.773470 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ssh-key\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: 
\"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.777123 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-dhcp-combined-ca-bundle\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.781730 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5ql6\" (UniqueName: \"kubernetes.io/projected/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-kube-api-access-q5ql6\") pod \"install-certs-openstack-openstack-cell1-5jpj2\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:50 crc kubenswrapper[4831]: I1203 08:38:50.912813 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:38:51 crc kubenswrapper[4831]: I1203 08:38:51.485267 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-openstack-openstack-cell1-5jpj2"] Dec 03 08:38:52 crc kubenswrapper[4831]: I1203 08:38:52.492627 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" event={"ID":"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f","Type":"ContainerStarted","Data":"5e22dc4bc47294100d90849722d60e089eaf304f444755b184bf4628963b2ad5"} Dec 03 08:38:52 crc kubenswrapper[4831]: I1203 08:38:52.493139 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" event={"ID":"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f","Type":"ContainerStarted","Data":"692c2d91b9bb3fccd8f329d79a0426576e3141ed9f381e877a1b38db8e1bdb09"} Dec 03 08:38:52 crc kubenswrapper[4831]: I1203 08:38:52.514776 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" podStartSLOduration=2.354186708 podStartE2EDuration="2.514758508s" podCreationTimestamp="2025-12-03 08:38:50 +0000 UTC" firstStartedPulling="2025-12-03 08:38:51.493202595 +0000 UTC m=+7668.836786103" lastFinishedPulling="2025-12-03 08:38:51.653774395 +0000 UTC m=+7668.997357903" observedRunningTime="2025-12-03 08:38:52.513276771 +0000 UTC m=+7669.856860289" watchObservedRunningTime="2025-12-03 08:38:52.514758508 +0000 UTC m=+7669.858342016" Dec 03 08:39:11 crc kubenswrapper[4831]: I1203 08:39:11.735956 4831 generic.go:334] "Generic (PLEG): container finished" podID="8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" containerID="5e22dc4bc47294100d90849722d60e089eaf304f444755b184bf4628963b2ad5" exitCode=0 Dec 03 08:39:11 crc kubenswrapper[4831]: I1203 08:39:11.736038 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" event={"ID":"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f","Type":"ContainerDied","Data":"5e22dc4bc47294100d90849722d60e089eaf304f444755b184bf4628963b2ad5"} Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.195727 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345003 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-sriov-combined-ca-bundle\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345052 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q5ql6\" (UniqueName: \"kubernetes.io/projected/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-kube-api-access-q5ql6\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345097 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-metadata-combined-ca-bundle\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345146 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-inventory\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345173 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-nova-combined-ca-bundle\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345226 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-bootstrap-combined-ca-bundle\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345274 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ssh-key\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345351 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-libvirt-combined-ca-bundle\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345432 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-telemetry-combined-ca-bundle\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345486 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ovn-combined-ca-bundle\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345507 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-dhcp-combined-ca-bundle\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.345542 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ceph\") pod \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\" (UID: \"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f\") " Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.352942 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.354181 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-kube-api-access-q5ql6" (OuterVolumeSpecName: "kube-api-access-q5ql6") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "kube-api-access-q5ql6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.354640 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.355022 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.356253 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.357170 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "ovn-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.358484 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.359646 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.375384 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ceph" (OuterVolumeSpecName: "ceph") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.376083 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.385658 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-inventory" (OuterVolumeSpecName: "inventory") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.392514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" (UID: "8a974bcc-3f87-42a4-9a9c-75fa79fcd20f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448510 4831 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448556 4831 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448568 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448578 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448593 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448602 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448613 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q5ql6\" (UniqueName: \"kubernetes.io/projected/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-kube-api-access-q5ql6\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448625 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448633 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448648 4831 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448656 4831 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.448665 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/8a974bcc-3f87-42a4-9a9c-75fa79fcd20f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.758214 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" event={"ID":"8a974bcc-3f87-42a4-9a9c-75fa79fcd20f","Type":"ContainerDied","Data":"692c2d91b9bb3fccd8f329d79a0426576e3141ed9f381e877a1b38db8e1bdb09"} Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.758257 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="692c2d91b9bb3fccd8f329d79a0426576e3141ed9f381e877a1b38db8e1bdb09" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.758281 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-openstack-openstack-cell1-5jpj2" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.854370 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-6mmwx"] Dec 03 08:39:13 crc kubenswrapper[4831]: E1203 08:39:13.854823 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" containerName="install-certs-openstack-openstack-cell1" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.854839 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" containerName="install-certs-openstack-openstack-cell1" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.855088 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a974bcc-3f87-42a4-9a9c-75fa79fcd20f" containerName="install-certs-openstack-openstack-cell1" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.855844 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.858262 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.858268 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.858364 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.858795 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.865148 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-6mmwx"] Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.957206 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.957265 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj2jh\" (UniqueName: \"kubernetes.io/projected/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-kube-api-access-wj2jh\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.957302 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-inventory\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:13 crc kubenswrapper[4831]: I1203 08:39:13.957368 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ceph\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.059334 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.059410 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj2jh\" (UniqueName: \"kubernetes.io/projected/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-kube-api-access-wj2jh\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.059455 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-inventory\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.059510 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ceph\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.064624 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ssh-key\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.064630 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ceph\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.064999 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-inventory\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.079099 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj2jh\" (UniqueName: \"kubernetes.io/projected/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-kube-api-access-wj2jh\") pod \"ceph-client-openstack-openstack-cell1-6mmwx\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.177614 
4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.683468 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-openstack-openstack-cell1-6mmwx"] Dec 03 08:39:14 crc kubenswrapper[4831]: W1203 08:39:14.684889 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa8d1192_9ba3_44c8_b5f0_78992aabc7d2.slice/crio-f1d3f967fa738e0372b8f6f5486d15f063a8cbf851b6eb1d4df26556707ac29e WatchSource:0}: Error finding container f1d3f967fa738e0372b8f6f5486d15f063a8cbf851b6eb1d4df26556707ac29e: Status 404 returned error can't find the container with id f1d3f967fa738e0372b8f6f5486d15f063a8cbf851b6eb1d4df26556707ac29e Dec 03 08:39:14 crc kubenswrapper[4831]: I1203 08:39:14.774049 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" event={"ID":"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2","Type":"ContainerStarted","Data":"f1d3f967fa738e0372b8f6f5486d15f063a8cbf851b6eb1d4df26556707ac29e"} Dec 03 08:39:15 crc kubenswrapper[4831]: I1203 08:39:15.790682 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" event={"ID":"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2","Type":"ContainerStarted","Data":"04cb747e1281d8ecc78b143b37134e929c2dc63cde746105f47aac185a3d2b53"} Dec 03 08:39:15 crc kubenswrapper[4831]: I1203 08:39:15.816653 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" podStartSLOduration=2.590651833 podStartE2EDuration="2.81663595s" podCreationTimestamp="2025-12-03 08:39:13 +0000 UTC" firstStartedPulling="2025-12-03 08:39:14.68790109 +0000 UTC m=+7692.031484588" lastFinishedPulling="2025-12-03 08:39:14.913885187 +0000 UTC m=+7692.257468705" 
observedRunningTime="2025-12-03 08:39:15.813739231 +0000 UTC m=+7693.157322739" watchObservedRunningTime="2025-12-03 08:39:15.81663595 +0000 UTC m=+7693.160219458" Dec 03 08:39:20 crc kubenswrapper[4831]: I1203 08:39:20.842623 4831 generic.go:334] "Generic (PLEG): container finished" podID="aa8d1192-9ba3-44c8-b5f0-78992aabc7d2" containerID="04cb747e1281d8ecc78b143b37134e929c2dc63cde746105f47aac185a3d2b53" exitCode=0 Dec 03 08:39:20 crc kubenswrapper[4831]: I1203 08:39:20.842736 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" event={"ID":"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2","Type":"ContainerDied","Data":"04cb747e1281d8ecc78b143b37134e929c2dc63cde746105f47aac185a3d2b53"} Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.322093 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.462441 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-inventory\") pod \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.462504 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ceph\") pod \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.462556 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj2jh\" (UniqueName: \"kubernetes.io/projected/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-kube-api-access-wj2jh\") pod \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " Dec 03 
08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.462677 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ssh-key\") pod \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\" (UID: \"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2\") " Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.470997 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ceph" (OuterVolumeSpecName: "ceph") pod "aa8d1192-9ba3-44c8-b5f0-78992aabc7d2" (UID: "aa8d1192-9ba3-44c8-b5f0-78992aabc7d2"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.472770 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-kube-api-access-wj2jh" (OuterVolumeSpecName: "kube-api-access-wj2jh") pod "aa8d1192-9ba3-44c8-b5f0-78992aabc7d2" (UID: "aa8d1192-9ba3-44c8-b5f0-78992aabc7d2"). InnerVolumeSpecName "kube-api-access-wj2jh". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.494908 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "aa8d1192-9ba3-44c8-b5f0-78992aabc7d2" (UID: "aa8d1192-9ba3-44c8-b5f0-78992aabc7d2"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.508673 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-inventory" (OuterVolumeSpecName: "inventory") pod "aa8d1192-9ba3-44c8-b5f0-78992aabc7d2" (UID: "aa8d1192-9ba3-44c8-b5f0-78992aabc7d2"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.565500 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.565525 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.565537 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.565547 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj2jh\" (UniqueName: \"kubernetes.io/projected/aa8d1192-9ba3-44c8-b5f0-78992aabc7d2-kube-api-access-wj2jh\") on node \"crc\" DevicePath \"\"" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.914518 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" event={"ID":"aa8d1192-9ba3-44c8-b5f0-78992aabc7d2","Type":"ContainerDied","Data":"f1d3f967fa738e0372b8f6f5486d15f063a8cbf851b6eb1d4df26556707ac29e"} Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.914836 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1d3f967fa738e0372b8f6f5486d15f063a8cbf851b6eb1d4df26556707ac29e" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.914539 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-openstack-openstack-cell1-6mmwx" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.981816 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-openstack-openstack-cell1-xk5l7"] Dec 03 08:39:24 crc kubenswrapper[4831]: E1203 08:39:22.982442 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa8d1192-9ba3-44c8-b5f0-78992aabc7d2" containerName="ceph-client-openstack-openstack-cell1" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.982462 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa8d1192-9ba3-44c8-b5f0-78992aabc7d2" containerName="ceph-client-openstack-openstack-cell1" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.982688 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa8d1192-9ba3-44c8-b5f0-78992aabc7d2" containerName="ceph-client-openstack-openstack-cell1" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.983706 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.985530 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.985819 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.985960 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.986077 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.989454 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:22.993896 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-xk5l7"] Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.004461 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ssh-key\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.004545 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ceph\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.004712 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98063d34-441b-4c6b-aa0c-4c600be73767-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.004827 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fb95\" (UniqueName: \"kubernetes.io/projected/98063d34-441b-4c6b-aa0c-4c600be73767-kube-api-access-8fb95\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.004970 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-inventory\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.005183 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.107832 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: 
\"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.107969 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ssh-key\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.108064 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ceph\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.108122 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98063d34-441b-4c6b-aa0c-4c600be73767-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.108188 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fb95\" (UniqueName: \"kubernetes.io/projected/98063d34-441b-4c6b-aa0c-4c600be73767-kube-api-access-8fb95\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.108267 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-inventory\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: 
\"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.109755 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98063d34-441b-4c6b-aa0c-4c600be73767-ovncontroller-config-0\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.113745 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-inventory\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.114633 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ovn-combined-ca-bundle\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.126170 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ssh-key\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.126453 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ceph\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " 
pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.127591 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fb95\" (UniqueName: \"kubernetes.io/projected/98063d34-441b-4c6b-aa0c-4c600be73767-kube-api-access-8fb95\") pod \"ovn-openstack-openstack-cell1-xk5l7\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:23.306232 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:24.446789 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-openstack-openstack-cell1-xk5l7"] Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:24.939799 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-xk5l7" event={"ID":"98063d34-441b-4c6b-aa0c-4c600be73767","Type":"ContainerStarted","Data":"9ee95308084af626f0d2e5d9b5c41de68ef8d5c9c1f517f6f0be50bb54bb72e5"} Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:24.940106 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-xk5l7" event={"ID":"98063d34-441b-4c6b-aa0c-4c600be73767","Type":"ContainerStarted","Data":"78ffbce986a0116514eda315b38f439d793b319802e945e12f52a165dc3595de"} Dec 03 08:39:24 crc kubenswrapper[4831]: I1203 08:39:24.959367 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-openstack-openstack-cell1-xk5l7" podStartSLOduration=2.723425119 podStartE2EDuration="2.959342346s" podCreationTimestamp="2025-12-03 08:39:22 +0000 UTC" firstStartedPulling="2025-12-03 08:39:24.448997633 +0000 UTC m=+7701.792581151" lastFinishedPulling="2025-12-03 08:39:24.68491487 +0000 UTC m=+7702.028498378" observedRunningTime="2025-12-03 08:39:24.957656243 +0000 UTC 
m=+7702.301239751" watchObservedRunningTime="2025-12-03 08:39:24.959342346 +0000 UTC m=+7702.302925874" Dec 03 08:39:57 crc kubenswrapper[4831]: I1203 08:39:57.596368 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:39:57 crc kubenswrapper[4831]: I1203 08:39:57.596899 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:40:27 crc kubenswrapper[4831]: I1203 08:40:27.596762 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:40:27 crc kubenswrapper[4831]: I1203 08:40:27.597254 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:40:35 crc kubenswrapper[4831]: I1203 08:40:35.732747 4831 generic.go:334] "Generic (PLEG): container finished" podID="98063d34-441b-4c6b-aa0c-4c600be73767" containerID="9ee95308084af626f0d2e5d9b5c41de68ef8d5c9c1f517f6f0be50bb54bb72e5" exitCode=0 Dec 03 08:40:35 crc kubenswrapper[4831]: I1203 08:40:35.732830 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-openstack-openstack-cell1-xk5l7" event={"ID":"98063d34-441b-4c6b-aa0c-4c600be73767","Type":"ContainerDied","Data":"9ee95308084af626f0d2e5d9b5c41de68ef8d5c9c1f517f6f0be50bb54bb72e5"} Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.265101 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.372691 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-inventory\") pod \"98063d34-441b-4c6b-aa0c-4c600be73767\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.372761 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ssh-key\") pod \"98063d34-441b-4c6b-aa0c-4c600be73767\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.372842 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8fb95\" (UniqueName: \"kubernetes.io/projected/98063d34-441b-4c6b-aa0c-4c600be73767-kube-api-access-8fb95\") pod \"98063d34-441b-4c6b-aa0c-4c600be73767\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.372870 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98063d34-441b-4c6b-aa0c-4c600be73767-ovncontroller-config-0\") pod \"98063d34-441b-4c6b-aa0c-4c600be73767\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.372943 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ceph\") pod \"98063d34-441b-4c6b-aa0c-4c600be73767\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.372994 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ovn-combined-ca-bundle\") pod \"98063d34-441b-4c6b-aa0c-4c600be73767\" (UID: \"98063d34-441b-4c6b-aa0c-4c600be73767\") " Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.378043 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/98063d34-441b-4c6b-aa0c-4c600be73767-kube-api-access-8fb95" (OuterVolumeSpecName: "kube-api-access-8fb95") pod "98063d34-441b-4c6b-aa0c-4c600be73767" (UID: "98063d34-441b-4c6b-aa0c-4c600be73767"). InnerVolumeSpecName "kube-api-access-8fb95". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.383552 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "98063d34-441b-4c6b-aa0c-4c600be73767" (UID: "98063d34-441b-4c6b-aa0c-4c600be73767"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.396476 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ceph" (OuterVolumeSpecName: "ceph") pod "98063d34-441b-4c6b-aa0c-4c600be73767" (UID: "98063d34-441b-4c6b-aa0c-4c600be73767"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.406593 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/98063d34-441b-4c6b-aa0c-4c600be73767-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "98063d34-441b-4c6b-aa0c-4c600be73767" (UID: "98063d34-441b-4c6b-aa0c-4c600be73767"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.408523 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "98063d34-441b-4c6b-aa0c-4c600be73767" (UID: "98063d34-441b-4c6b-aa0c-4c600be73767"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.410054 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-inventory" (OuterVolumeSpecName: "inventory") pod "98063d34-441b-4c6b-aa0c-4c600be73767" (UID: "98063d34-441b-4c6b-aa0c-4c600be73767"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.475952 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.476002 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.476020 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.476039 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8fb95\" (UniqueName: \"kubernetes.io/projected/98063d34-441b-4c6b-aa0c-4c600be73767-kube-api-access-8fb95\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.476058 4831 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/98063d34-441b-4c6b-aa0c-4c600be73767-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.476077 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/98063d34-441b-4c6b-aa0c-4c600be73767-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.775959 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-openstack-openstack-cell1-xk5l7" event={"ID":"98063d34-441b-4c6b-aa0c-4c600be73767","Type":"ContainerDied","Data":"78ffbce986a0116514eda315b38f439d793b319802e945e12f52a165dc3595de"} Dec 03 08:40:37 crc 
kubenswrapper[4831]: I1203 08:40:37.776256 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78ffbce986a0116514eda315b38f439d793b319802e945e12f52a165dc3595de" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.776018 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-openstack-openstack-cell1-xk5l7" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.884645 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-vf2x4"] Dec 03 08:40:37 crc kubenswrapper[4831]: E1203 08:40:37.885082 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="98063d34-441b-4c6b-aa0c-4c600be73767" containerName="ovn-openstack-openstack-cell1" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.885104 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="98063d34-441b-4c6b-aa0c-4c600be73767" containerName="ovn-openstack-openstack-cell1" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.885366 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="98063d34-441b-4c6b-aa0c-4c600be73767" containerName="ovn-openstack-openstack-cell1" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.886130 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.894014 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.894184 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.894322 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.894447 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.894541 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.894672 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:40:37 crc kubenswrapper[4831]: I1203 08:40:37.903464 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-vf2x4"] Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.034787 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.035247 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmxtr\" 
(UniqueName: \"kubernetes.io/projected/1a59fd86-ca99-4128-bbcc-6f1075dbafce-kube-api-access-vmxtr\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.035573 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.035723 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.035801 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.035894 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.035940 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.138491 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.138615 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.138686 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.138768 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.138866 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.139110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.139202 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmxtr\" (UniqueName: \"kubernetes.io/projected/1a59fd86-ca99-4128-bbcc-6f1075dbafce-kube-api-access-vmxtr\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.144270 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-inventory\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " 
pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.145446 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.146107 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.147241 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ceph\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.147702 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ssh-key\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.148539 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.162902 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmxtr\" (UniqueName: \"kubernetes.io/projected/1a59fd86-ca99-4128-bbcc-6f1075dbafce-kube-api-access-vmxtr\") pod \"neutron-metadata-openstack-openstack-cell1-vf2x4\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.250035 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:40:38 crc kubenswrapper[4831]: I1203 08:40:38.847067 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-openstack-openstack-cell1-vf2x4"] Dec 03 08:40:39 crc kubenswrapper[4831]: I1203 08:40:39.800308 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" event={"ID":"1a59fd86-ca99-4128-bbcc-6f1075dbafce","Type":"ContainerStarted","Data":"6d2f4c1aef5c0efc6f91239a2a12f6fce4d8d163b7d60ff3859e6922449f2a50"} Dec 03 08:40:39 crc kubenswrapper[4831]: I1203 08:40:39.800608 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" event={"ID":"1a59fd86-ca99-4128-bbcc-6f1075dbafce","Type":"ContainerStarted","Data":"3709022d158059e0c5546e8fc39fd3276762596275ad76ecd7e252905a093aa1"} Dec 03 08:40:39 crc kubenswrapper[4831]: I1203 08:40:39.835556 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" podStartSLOduration=2.683708871 
podStartE2EDuration="2.835527258s" podCreationTimestamp="2025-12-03 08:40:37 +0000 UTC" firstStartedPulling="2025-12-03 08:40:38.851489344 +0000 UTC m=+7776.195072852" lastFinishedPulling="2025-12-03 08:40:39.003307731 +0000 UTC m=+7776.346891239" observedRunningTime="2025-12-03 08:40:39.825107964 +0000 UTC m=+7777.168691472" watchObservedRunningTime="2025-12-03 08:40:39.835527258 +0000 UTC m=+7777.179110796" Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.597128 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.597654 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.597713 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.598587 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.598659 4831 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" gracePeriod=600 Dec 03 08:40:57 crc kubenswrapper[4831]: E1203 08:40:57.730435 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.997751 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" exitCode=0 Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.997803 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5"} Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.997888 4831 scope.go:117] "RemoveContainer" containerID="da347abb806229f967733bacbfad1365f4ae7089c1b4bf1476bfd8bf32654286" Dec 03 08:40:57 crc kubenswrapper[4831]: I1203 08:40:57.999728 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:40:58 crc kubenswrapper[4831]: E1203 08:40:58.000130 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:41:12 crc kubenswrapper[4831]: I1203 08:41:12.013242 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:41:12 crc kubenswrapper[4831]: E1203 08:41:12.014193 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:41:24 crc kubenswrapper[4831]: I1203 08:41:24.013425 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:41:24 crc kubenswrapper[4831]: E1203 08:41:24.014998 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:41:36 crc kubenswrapper[4831]: I1203 08:41:36.449564 4831 generic.go:334] "Generic (PLEG): container finished" podID="1a59fd86-ca99-4128-bbcc-6f1075dbafce" containerID="6d2f4c1aef5c0efc6f91239a2a12f6fce4d8d163b7d60ff3859e6922449f2a50" exitCode=0 Dec 03 08:41:36 crc kubenswrapper[4831]: I1203 08:41:36.449750 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" event={"ID":"1a59fd86-ca99-4128-bbcc-6f1075dbafce","Type":"ContainerDied","Data":"6d2f4c1aef5c0efc6f91239a2a12f6fce4d8d163b7d60ff3859e6922449f2a50"} Dec 03 08:41:37 crc kubenswrapper[4831]: I1203 08:41:37.956414 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.030235 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.030313 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-metadata-combined-ca-bundle\") pod \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.030486 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmxtr\" (UniqueName: \"kubernetes.io/projected/1a59fd86-ca99-4128-bbcc-6f1075dbafce-kube-api-access-vmxtr\") pod \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.030527 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ceph\") pod \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.030556 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-inventory\") pod \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.030575 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-nova-metadata-neutron-config-0\") pod \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.030724 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ssh-key\") pod \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\" (UID: \"1a59fd86-ca99-4128-bbcc-6f1075dbafce\") " Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.035924 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "1a59fd86-ca99-4128-bbcc-6f1075dbafce" (UID: "1a59fd86-ca99-4128-bbcc-6f1075dbafce"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.044575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ceph" (OuterVolumeSpecName: "ceph") pod "1a59fd86-ca99-4128-bbcc-6f1075dbafce" (UID: "1a59fd86-ca99-4128-bbcc-6f1075dbafce"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.044719 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a59fd86-ca99-4128-bbcc-6f1075dbafce-kube-api-access-vmxtr" (OuterVolumeSpecName: "kube-api-access-vmxtr") pod "1a59fd86-ca99-4128-bbcc-6f1075dbafce" (UID: "1a59fd86-ca99-4128-bbcc-6f1075dbafce"). InnerVolumeSpecName "kube-api-access-vmxtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.058980 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "1a59fd86-ca99-4128-bbcc-6f1075dbafce" (UID: "1a59fd86-ca99-4128-bbcc-6f1075dbafce"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.070848 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "1a59fd86-ca99-4128-bbcc-6f1075dbafce" (UID: "1a59fd86-ca99-4128-bbcc-6f1075dbafce"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.071078 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-inventory" (OuterVolumeSpecName: "inventory") pod "1a59fd86-ca99-4128-bbcc-6f1075dbafce" (UID: "1a59fd86-ca99-4128-bbcc-6f1075dbafce"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.074699 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "1a59fd86-ca99-4128-bbcc-6f1075dbafce" (UID: "1a59fd86-ca99-4128-bbcc-6f1075dbafce"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.133152 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.133480 4831 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.133494 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.133509 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.133522 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.133536 4831 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-vmxtr\" (UniqueName: \"kubernetes.io/projected/1a59fd86-ca99-4128-bbcc-6f1075dbafce-kube-api-access-vmxtr\") on node \"crc\" DevicePath \"\"" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.133545 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/1a59fd86-ca99-4128-bbcc-6f1075dbafce-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.472679 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" event={"ID":"1a59fd86-ca99-4128-bbcc-6f1075dbafce","Type":"ContainerDied","Data":"3709022d158059e0c5546e8fc39fd3276762596275ad76ecd7e252905a093aa1"} Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.472732 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3709022d158059e0c5546e8fc39fd3276762596275ad76ecd7e252905a093aa1" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.472727 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-openstack-openstack-cell1-vf2x4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.567662 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-kwxr4"] Dec 03 08:41:38 crc kubenswrapper[4831]: E1203 08:41:38.568135 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a59fd86-ca99-4128-bbcc-6f1075dbafce" containerName="neutron-metadata-openstack-openstack-cell1" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.568154 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a59fd86-ca99-4128-bbcc-6f1075dbafce" containerName="neutron-metadata-openstack-openstack-cell1" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.568395 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a59fd86-ca99-4128-bbcc-6f1075dbafce" containerName="neutron-metadata-openstack-openstack-cell1" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.569092 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.572104 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.573014 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.573025 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.573112 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.579797 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.630968 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-kwxr4"] Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.649145 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.649227 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndj65\" (UniqueName: \"kubernetes.io/projected/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-kube-api-access-ndj65\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 
08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.649409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ssh-key\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.649475 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ceph\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.649657 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-inventory\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.649751 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.751418 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-inventory\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: 
\"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.751806 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.751913 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-secret-0\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.751953 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndj65\" (UniqueName: \"kubernetes.io/projected/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-kube-api-access-ndj65\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.751993 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ssh-key\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.752027 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ceph\") 
pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.760353 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ceph\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.760511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-combined-ca-bundle\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.760807 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-inventory\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.761501 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ssh-key\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.762356 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-secret-0\") pod 
\"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.782670 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndj65\" (UniqueName: \"kubernetes.io/projected/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-kube-api-access-ndj65\") pod \"libvirt-openstack-openstack-cell1-kwxr4\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:38 crc kubenswrapper[4831]: I1203 08:41:38.897014 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:41:39 crc kubenswrapper[4831]: I1203 08:41:39.013759 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:41:39 crc kubenswrapper[4831]: E1203 08:41:39.014353 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:41:39 crc kubenswrapper[4831]: I1203 08:41:39.567746 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:41:39 crc kubenswrapper[4831]: I1203 08:41:39.569606 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-openstack-openstack-cell1-kwxr4"] Dec 03 08:41:40 crc kubenswrapper[4831]: I1203 08:41:40.494135 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" 
event={"ID":"eb53812e-d0f1-4a38-b47b-00917ae4fa4f","Type":"ContainerStarted","Data":"c1164d70bd5fd215b5a73720f1189c1697ebe293bd8f511f417cf65380ded665"} Dec 03 08:41:40 crc kubenswrapper[4831]: I1203 08:41:40.494475 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" event={"ID":"eb53812e-d0f1-4a38-b47b-00917ae4fa4f","Type":"ContainerStarted","Data":"42950486032f5bfedebbf87374cbd187f01e7af7622d3925f306377108936651"} Dec 03 08:41:40 crc kubenswrapper[4831]: I1203 08:41:40.512960 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" podStartSLOduration=2.332049559 podStartE2EDuration="2.51290181s" podCreationTimestamp="2025-12-03 08:41:38 +0000 UTC" firstStartedPulling="2025-12-03 08:41:39.567481529 +0000 UTC m=+7836.911065047" lastFinishedPulling="2025-12-03 08:41:39.74833378 +0000 UTC m=+7837.091917298" observedRunningTime="2025-12-03 08:41:40.512791056 +0000 UTC m=+7837.856374564" watchObservedRunningTime="2025-12-03 08:41:40.51290181 +0000 UTC m=+7837.856485328" Dec 03 08:41:50 crc kubenswrapper[4831]: I1203 08:41:50.013687 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:41:50 crc kubenswrapper[4831]: E1203 08:41:50.014938 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:42:01 crc kubenswrapper[4831]: I1203 08:42:01.014150 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:42:01 crc 
kubenswrapper[4831]: E1203 08:42:01.015652 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:42:13 crc kubenswrapper[4831]: I1203 08:42:13.031182 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:42:13 crc kubenswrapper[4831]: E1203 08:42:13.032994 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:42:26 crc kubenswrapper[4831]: I1203 08:42:26.013802 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:42:26 crc kubenswrapper[4831]: E1203 08:42:26.015160 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:42:41 crc kubenswrapper[4831]: I1203 08:42:41.014952 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 
03 08:42:41 crc kubenswrapper[4831]: E1203 08:42:41.016187 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:42:54 crc kubenswrapper[4831]: I1203 08:42:54.013983 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:42:54 crc kubenswrapper[4831]: E1203 08:42:54.015764 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:43:05 crc kubenswrapper[4831]: I1203 08:43:05.013576 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:43:05 crc kubenswrapper[4831]: E1203 08:43:05.015923 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:43:16 crc kubenswrapper[4831]: I1203 08:43:16.014030 4831 scope.go:117] "RemoveContainer" 
containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:43:16 crc kubenswrapper[4831]: E1203 08:43:16.016030 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:43:30 crc kubenswrapper[4831]: I1203 08:43:30.013012 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:43:30 crc kubenswrapper[4831]: E1203 08:43:30.013801 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:43:42 crc kubenswrapper[4831]: I1203 08:43:42.012788 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:43:42 crc kubenswrapper[4831]: E1203 08:43:42.013834 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:43:57 crc kubenswrapper[4831]: I1203 08:43:57.014815 4831 scope.go:117] 
"RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:43:57 crc kubenswrapper[4831]: E1203 08:43:57.015629 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:44:11 crc kubenswrapper[4831]: I1203 08:44:11.015178 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:44:11 crc kubenswrapper[4831]: E1203 08:44:11.018072 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:44:24 crc kubenswrapper[4831]: I1203 08:44:24.013865 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:44:24 crc kubenswrapper[4831]: E1203 08:44:24.015074 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:44:25 crc kubenswrapper[4831]: I1203 08:44:25.956383 
4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-nnp4p"] Dec 03 08:44:25 crc kubenswrapper[4831]: I1203 08:44:25.986896 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnp4p"] Dec 03 08:44:25 crc kubenswrapper[4831]: I1203 08:44:25.987028 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.119717 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-utilities\") pod \"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.120086 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq225\" (UniqueName: \"kubernetes.io/projected/402492ef-0db7-4e1f-a309-5d3bfaaba78f-kube-api-access-zq225\") pod \"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.120217 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-catalog-content\") pod \"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.222053 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-utilities\") pod 
\"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.222190 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zq225\" (UniqueName: \"kubernetes.io/projected/402492ef-0db7-4e1f-a309-5d3bfaaba78f-kube-api-access-zq225\") pod \"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.222254 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-catalog-content\") pod \"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.222714 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-catalog-content\") pod \"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.222938 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-utilities\") pod \"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.241566 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq225\" (UniqueName: \"kubernetes.io/projected/402492ef-0db7-4e1f-a309-5d3bfaaba78f-kube-api-access-zq225\") pod 
\"certified-operators-nnp4p\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.320121 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:26 crc kubenswrapper[4831]: I1203 08:44:26.907358 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-nnp4p"] Dec 03 08:44:27 crc kubenswrapper[4831]: I1203 08:44:27.681157 4831 generic.go:334] "Generic (PLEG): container finished" podID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerID="c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf" exitCode=0 Dec 03 08:44:27 crc kubenswrapper[4831]: I1203 08:44:27.681195 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnp4p" event={"ID":"402492ef-0db7-4e1f-a309-5d3bfaaba78f","Type":"ContainerDied","Data":"c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf"} Dec 03 08:44:27 crc kubenswrapper[4831]: I1203 08:44:27.681474 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnp4p" event={"ID":"402492ef-0db7-4e1f-a309-5d3bfaaba78f","Type":"ContainerStarted","Data":"d08b31b8c0f6438af524532f1065ec8a274e7bc2b4c002e01616f480465df888"} Dec 03 08:44:28 crc kubenswrapper[4831]: I1203 08:44:28.693091 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnp4p" event={"ID":"402492ef-0db7-4e1f-a309-5d3bfaaba78f","Type":"ContainerStarted","Data":"db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178"} Dec 03 08:44:29 crc kubenswrapper[4831]: I1203 08:44:29.706457 4831 generic.go:334] "Generic (PLEG): container finished" podID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerID="db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178" exitCode=0 Dec 03 08:44:29 
crc kubenswrapper[4831]: I1203 08:44:29.706572 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnp4p" event={"ID":"402492ef-0db7-4e1f-a309-5d3bfaaba78f","Type":"ContainerDied","Data":"db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178"} Dec 03 08:44:30 crc kubenswrapper[4831]: I1203 08:44:30.732248 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnp4p" event={"ID":"402492ef-0db7-4e1f-a309-5d3bfaaba78f","Type":"ContainerStarted","Data":"5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936"} Dec 03 08:44:30 crc kubenswrapper[4831]: I1203 08:44:30.768707 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-nnp4p" podStartSLOduration=3.355812958 podStartE2EDuration="5.76866957s" podCreationTimestamp="2025-12-03 08:44:25 +0000 UTC" firstStartedPulling="2025-12-03 08:44:27.683338983 +0000 UTC m=+8005.026922491" lastFinishedPulling="2025-12-03 08:44:30.096195595 +0000 UTC m=+8007.439779103" observedRunningTime="2025-12-03 08:44:30.757771831 +0000 UTC m=+8008.101355359" watchObservedRunningTime="2025-12-03 08:44:30.76866957 +0000 UTC m=+8008.112253118" Dec 03 08:44:36 crc kubenswrapper[4831]: I1203 08:44:36.321431 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:36 crc kubenswrapper[4831]: I1203 08:44:36.322093 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:36 crc kubenswrapper[4831]: I1203 08:44:36.378747 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:36 crc kubenswrapper[4831]: I1203 08:44:36.899008 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:36 crc kubenswrapper[4831]: I1203 08:44:36.952422 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnp4p"] Dec 03 08:44:37 crc kubenswrapper[4831]: I1203 08:44:37.012759 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:44:37 crc kubenswrapper[4831]: E1203 08:44:37.013084 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:44:38 crc kubenswrapper[4831]: I1203 08:44:38.833017 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-nnp4p" podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerName="registry-server" containerID="cri-o://5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936" gracePeriod=2 Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.517991 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.568911 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-utilities\") pod \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.568960 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-catalog-content\") pod \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.569136 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zq225\" (UniqueName: \"kubernetes.io/projected/402492ef-0db7-4e1f-a309-5d3bfaaba78f-kube-api-access-zq225\") pod \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\" (UID: \"402492ef-0db7-4e1f-a309-5d3bfaaba78f\") " Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.570514 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-utilities" (OuterVolumeSpecName: "utilities") pod "402492ef-0db7-4e1f-a309-5d3bfaaba78f" (UID: "402492ef-0db7-4e1f-a309-5d3bfaaba78f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.578769 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/402492ef-0db7-4e1f-a309-5d3bfaaba78f-kube-api-access-zq225" (OuterVolumeSpecName: "kube-api-access-zq225") pod "402492ef-0db7-4e1f-a309-5d3bfaaba78f" (UID: "402492ef-0db7-4e1f-a309-5d3bfaaba78f"). InnerVolumeSpecName "kube-api-access-zq225". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.670986 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.671044 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zq225\" (UniqueName: \"kubernetes.io/projected/402492ef-0db7-4e1f-a309-5d3bfaaba78f-kube-api-access-zq225\") on node \"crc\" DevicePath \"\"" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.836433 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "402492ef-0db7-4e1f-a309-5d3bfaaba78f" (UID: "402492ef-0db7-4e1f-a309-5d3bfaaba78f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.845312 4831 generic.go:334] "Generic (PLEG): container finished" podID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerID="5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936" exitCode=0 Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.845366 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnp4p" event={"ID":"402492ef-0db7-4e1f-a309-5d3bfaaba78f","Type":"ContainerDied","Data":"5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936"} Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.845427 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-nnp4p" event={"ID":"402492ef-0db7-4e1f-a309-5d3bfaaba78f","Type":"ContainerDied","Data":"d08b31b8c0f6438af524532f1065ec8a274e7bc2b4c002e01616f480465df888"} Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 
08:44:39.845451 4831 scope.go:117] "RemoveContainer" containerID="5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.845522 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-nnp4p" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.876775 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/402492ef-0db7-4e1f-a309-5d3bfaaba78f-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.880371 4831 scope.go:117] "RemoveContainer" containerID="db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.918214 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-nnp4p"] Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.926110 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-nnp4p"] Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.931944 4831 scope.go:117] "RemoveContainer" containerID="c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.986067 4831 scope.go:117] "RemoveContainer" containerID="5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936" Dec 03 08:44:39 crc kubenswrapper[4831]: E1203 08:44:39.987406 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936\": container with ID starting with 5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936 not found: ID does not exist" containerID="5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 
08:44:39.987486 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936"} err="failed to get container status \"5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936\": rpc error: code = NotFound desc = could not find container \"5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936\": container with ID starting with 5da9d90b5fd7f05cf1c76e9794893ff03783683d6d5ad5c34b7401cace11c936 not found: ID does not exist" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.987518 4831 scope.go:117] "RemoveContainer" containerID="db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178" Dec 03 08:44:39 crc kubenswrapper[4831]: E1203 08:44:39.989030 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178\": container with ID starting with db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178 not found: ID does not exist" containerID="db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.989076 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178"} err="failed to get container status \"db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178\": rpc error: code = NotFound desc = could not find container \"db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178\": container with ID starting with db58d12147bcdcb3bd86ba53cd44fda82c527768a412a6d6e4fce863345bf178 not found: ID does not exist" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.989203 4831 scope.go:117] "RemoveContainer" containerID="c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf" Dec 03 08:44:39 crc 
kubenswrapper[4831]: E1203 08:44:39.990057 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf\": container with ID starting with c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf not found: ID does not exist" containerID="c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf" Dec 03 08:44:39 crc kubenswrapper[4831]: I1203 08:44:39.990111 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf"} err="failed to get container status \"c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf\": rpc error: code = NotFound desc = could not find container \"c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf\": container with ID starting with c7753a41fff988a0340881da936bf35ff7a60265efc50545a712762e07ce4daf not found: ID does not exist" Dec 03 08:44:41 crc kubenswrapper[4831]: I1203 08:44:41.037935 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" path="/var/lib/kubelet/pods/402492ef-0db7-4e1f-a309-5d3bfaaba78f/volumes" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.047256 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-28lrm"] Dec 03 08:44:42 crc kubenswrapper[4831]: E1203 08:44:42.048203 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerName="registry-server" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.048291 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerName="registry-server" Dec 03 08:44:42 crc kubenswrapper[4831]: E1203 08:44:42.048397 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerName="extract-content" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.048449 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerName="extract-content" Dec 03 08:44:42 crc kubenswrapper[4831]: E1203 08:44:42.048509 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerName="extract-utilities" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.048554 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerName="extract-utilities" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.048825 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="402492ef-0db7-4e1f-a309-5d3bfaaba78f" containerName="registry-server" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.050651 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.070391 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28lrm"] Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.146866 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mcxdm\" (UniqueName: \"kubernetes.io/projected/1b276208-b556-44ce-b5b0-6d0705954d05-kube-api-access-mcxdm\") pod \"community-operators-28lrm\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.146979 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-utilities\") pod \"community-operators-28lrm\" (UID: 
\"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.147067 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-catalog-content\") pod \"community-operators-28lrm\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.248371 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mcxdm\" (UniqueName: \"kubernetes.io/projected/1b276208-b556-44ce-b5b0-6d0705954d05-kube-api-access-mcxdm\") pod \"community-operators-28lrm\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.248643 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-utilities\") pod \"community-operators-28lrm\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.248697 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-catalog-content\") pod \"community-operators-28lrm\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.249176 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-catalog-content\") pod \"community-operators-28lrm\" (UID: 
\"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.249212 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-utilities\") pod \"community-operators-28lrm\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.268117 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mcxdm\" (UniqueName: \"kubernetes.io/projected/1b276208-b556-44ce-b5b0-6d0705954d05-kube-api-access-mcxdm\") pod \"community-operators-28lrm\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.392895 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:42 crc kubenswrapper[4831]: I1203 08:44:42.940490 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-28lrm"] Dec 03 08:44:43 crc kubenswrapper[4831]: I1203 08:44:43.902552 4831 generic.go:334] "Generic (PLEG): container finished" podID="1b276208-b556-44ce-b5b0-6d0705954d05" containerID="146be2a867027b894e3ea134600d85d0a1b33dace1a59fa9319b19c45174cdf6" exitCode=0 Dec 03 08:44:43 crc kubenswrapper[4831]: I1203 08:44:43.902635 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28lrm" event={"ID":"1b276208-b556-44ce-b5b0-6d0705954d05","Type":"ContainerDied","Data":"146be2a867027b894e3ea134600d85d0a1b33dace1a59fa9319b19c45174cdf6"} Dec 03 08:44:43 crc kubenswrapper[4831]: I1203 08:44:43.903683 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28lrm" 
event={"ID":"1b276208-b556-44ce-b5b0-6d0705954d05","Type":"ContainerStarted","Data":"1247de802b4a960aa5b36662b9c90fb77c64abeb652691daac40158674600fd5"} Dec 03 08:44:44 crc kubenswrapper[4831]: I1203 08:44:44.916980 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28lrm" event={"ID":"1b276208-b556-44ce-b5b0-6d0705954d05","Type":"ContainerStarted","Data":"913bb0c51e5d951b3fe2c53930d145c3fd41a7c6f046664fe00e7441752239b8"} Dec 03 08:44:45 crc kubenswrapper[4831]: I1203 08:44:45.929054 4831 generic.go:334] "Generic (PLEG): container finished" podID="1b276208-b556-44ce-b5b0-6d0705954d05" containerID="913bb0c51e5d951b3fe2c53930d145c3fd41a7c6f046664fe00e7441752239b8" exitCode=0 Dec 03 08:44:45 crc kubenswrapper[4831]: I1203 08:44:45.929149 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28lrm" event={"ID":"1b276208-b556-44ce-b5b0-6d0705954d05","Type":"ContainerDied","Data":"913bb0c51e5d951b3fe2c53930d145c3fd41a7c6f046664fe00e7441752239b8"} Dec 03 08:44:46 crc kubenswrapper[4831]: I1203 08:44:46.941044 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28lrm" event={"ID":"1b276208-b556-44ce-b5b0-6d0705954d05","Type":"ContainerStarted","Data":"b9675819893a97fc89d7a6b8052b593b6f6528d279ab454a899fb93ea9e662e1"} Dec 03 08:44:46 crc kubenswrapper[4831]: I1203 08:44:46.962383 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-28lrm" podStartSLOduration=2.55522968 podStartE2EDuration="4.962361583s" podCreationTimestamp="2025-12-03 08:44:42 +0000 UTC" firstStartedPulling="2025-12-03 08:44:43.905855674 +0000 UTC m=+8021.249439222" lastFinishedPulling="2025-12-03 08:44:46.312987617 +0000 UTC m=+8023.656571125" observedRunningTime="2025-12-03 08:44:46.95587087 +0000 UTC m=+8024.299454408" watchObservedRunningTime="2025-12-03 08:44:46.962361583 +0000 UTC 
m=+8024.305945091" Dec 03 08:44:51 crc kubenswrapper[4831]: I1203 08:44:51.012940 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:44:51 crc kubenswrapper[4831]: E1203 08:44:51.013849 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:44:52 crc kubenswrapper[4831]: I1203 08:44:52.393879 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:52 crc kubenswrapper[4831]: I1203 08:44:52.394968 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:52 crc kubenswrapper[4831]: I1203 08:44:52.455293 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:53 crc kubenswrapper[4831]: I1203 08:44:53.051733 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:53 crc kubenswrapper[4831]: I1203 08:44:53.110721 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-28lrm"] Dec 03 08:44:55 crc kubenswrapper[4831]: I1203 08:44:55.023074 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-28lrm" podUID="1b276208-b556-44ce-b5b0-6d0705954d05" containerName="registry-server" containerID="cri-o://b9675819893a97fc89d7a6b8052b593b6f6528d279ab454a899fb93ea9e662e1" gracePeriod=2 Dec 
03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.062741 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28lrm" event={"ID":"1b276208-b556-44ce-b5b0-6d0705954d05","Type":"ContainerDied","Data":"b9675819893a97fc89d7a6b8052b593b6f6528d279ab454a899fb93ea9e662e1"} Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.062857 4831 generic.go:334] "Generic (PLEG): container finished" podID="1b276208-b556-44ce-b5b0-6d0705954d05" containerID="b9675819893a97fc89d7a6b8052b593b6f6528d279ab454a899fb93ea9e662e1" exitCode=0 Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.063256 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-28lrm" event={"ID":"1b276208-b556-44ce-b5b0-6d0705954d05","Type":"ContainerDied","Data":"1247de802b4a960aa5b36662b9c90fb77c64abeb652691daac40158674600fd5"} Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.063424 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1247de802b4a960aa5b36662b9c90fb77c64abeb652691daac40158674600fd5" Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.086922 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.226082 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-utilities\") pod \"1b276208-b556-44ce-b5b0-6d0705954d05\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.226525 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-catalog-content\") pod \"1b276208-b556-44ce-b5b0-6d0705954d05\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.226638 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mcxdm\" (UniqueName: \"kubernetes.io/projected/1b276208-b556-44ce-b5b0-6d0705954d05-kube-api-access-mcxdm\") pod \"1b276208-b556-44ce-b5b0-6d0705954d05\" (UID: \"1b276208-b556-44ce-b5b0-6d0705954d05\") " Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.228731 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-utilities" (OuterVolumeSpecName: "utilities") pod "1b276208-b556-44ce-b5b0-6d0705954d05" (UID: "1b276208-b556-44ce-b5b0-6d0705954d05"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.242559 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b276208-b556-44ce-b5b0-6d0705954d05-kube-api-access-mcxdm" (OuterVolumeSpecName: "kube-api-access-mcxdm") pod "1b276208-b556-44ce-b5b0-6d0705954d05" (UID: "1b276208-b556-44ce-b5b0-6d0705954d05"). InnerVolumeSpecName "kube-api-access-mcxdm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.331866 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mcxdm\" (UniqueName: \"kubernetes.io/projected/1b276208-b556-44ce-b5b0-6d0705954d05-kube-api-access-mcxdm\") on node \"crc\" DevicePath \"\"" Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.331904 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.333691 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b276208-b556-44ce-b5b0-6d0705954d05" (UID: "1b276208-b556-44ce-b5b0-6d0705954d05"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:44:56 crc kubenswrapper[4831]: I1203 08:44:56.434035 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b276208-b556-44ce-b5b0-6d0705954d05-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:44:57 crc kubenswrapper[4831]: I1203 08:44:57.072944 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-28lrm" Dec 03 08:44:57 crc kubenswrapper[4831]: I1203 08:44:57.096009 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-28lrm"] Dec 03 08:44:57 crc kubenswrapper[4831]: I1203 08:44:57.104359 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-28lrm"] Dec 03 08:44:59 crc kubenswrapper[4831]: I1203 08:44:59.031876 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b276208-b556-44ce-b5b0-6d0705954d05" path="/var/lib/kubelet/pods/1b276208-b556-44ce-b5b0-6d0705954d05/volumes" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.178767 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf"] Dec 03 08:45:00 crc kubenswrapper[4831]: E1203 08:45:00.179594 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b276208-b556-44ce-b5b0-6d0705954d05" containerName="registry-server" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.179612 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b276208-b556-44ce-b5b0-6d0705954d05" containerName="registry-server" Dec 03 08:45:00 crc kubenswrapper[4831]: E1203 08:45:00.179642 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b276208-b556-44ce-b5b0-6d0705954d05" containerName="extract-utilities" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.179655 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b276208-b556-44ce-b5b0-6d0705954d05" containerName="extract-utilities" Dec 03 08:45:00 crc kubenswrapper[4831]: E1203 08:45:00.179701 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b276208-b556-44ce-b5b0-6d0705954d05" containerName="extract-content" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.179710 4831 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="1b276208-b556-44ce-b5b0-6d0705954d05" containerName="extract-content" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.180012 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b276208-b556-44ce-b5b0-6d0705954d05" containerName="registry-server" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.181030 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.183825 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.186551 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.192542 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf"] Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.319783 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80fade55-3e0a-4cdc-97fd-e3e58fc93880-config-volume\") pod \"collect-profiles-29412525-t82vf\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.319871 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80fade55-3e0a-4cdc-97fd-e3e58fc93880-secret-volume\") pod \"collect-profiles-29412525-t82vf\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc 
kubenswrapper[4831]: I1203 08:45:00.319947 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfs84\" (UniqueName: \"kubernetes.io/projected/80fade55-3e0a-4cdc-97fd-e3e58fc93880-kube-api-access-kfs84\") pod \"collect-profiles-29412525-t82vf\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.421911 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfs84\" (UniqueName: \"kubernetes.io/projected/80fade55-3e0a-4cdc-97fd-e3e58fc93880-kube-api-access-kfs84\") pod \"collect-profiles-29412525-t82vf\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.422100 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80fade55-3e0a-4cdc-97fd-e3e58fc93880-config-volume\") pod \"collect-profiles-29412525-t82vf\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.422140 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80fade55-3e0a-4cdc-97fd-e3e58fc93880-secret-volume\") pod \"collect-profiles-29412525-t82vf\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.422947 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80fade55-3e0a-4cdc-97fd-e3e58fc93880-config-volume\") pod \"collect-profiles-29412525-t82vf\" 
(UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.428154 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80fade55-3e0a-4cdc-97fd-e3e58fc93880-secret-volume\") pod \"collect-profiles-29412525-t82vf\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.442546 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfs84\" (UniqueName: \"kubernetes.io/projected/80fade55-3e0a-4cdc-97fd-e3e58fc93880-kube-api-access-kfs84\") pod \"collect-profiles-29412525-t82vf\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:00 crc kubenswrapper[4831]: I1203 08:45:00.511042 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:01 crc kubenswrapper[4831]: I1203 08:45:01.029594 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf"] Dec 03 08:45:01 crc kubenswrapper[4831]: I1203 08:45:01.115613 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" event={"ID":"80fade55-3e0a-4cdc-97fd-e3e58fc93880","Type":"ContainerStarted","Data":"e7320f851bd6823d53fb8c65961a090fa8e00c7d8e71529d703da37477a3e8a0"} Dec 03 08:45:02 crc kubenswrapper[4831]: I1203 08:45:02.013587 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:45:02 crc kubenswrapper[4831]: E1203 08:45:02.014503 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:45:02 crc kubenswrapper[4831]: I1203 08:45:02.126706 4831 generic.go:334] "Generic (PLEG): container finished" podID="80fade55-3e0a-4cdc-97fd-e3e58fc93880" containerID="5b7e936e187a93f742d1d91f887b553b35c87a45ec4a32454eb9d6ad6df4d81c" exitCode=0 Dec 03 08:45:02 crc kubenswrapper[4831]: I1203 08:45:02.126780 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" event={"ID":"80fade55-3e0a-4cdc-97fd-e3e58fc93880","Type":"ContainerDied","Data":"5b7e936e187a93f742d1d91f887b553b35c87a45ec4a32454eb9d6ad6df4d81c"} Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.552987 4831 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.699085 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfs84\" (UniqueName: \"kubernetes.io/projected/80fade55-3e0a-4cdc-97fd-e3e58fc93880-kube-api-access-kfs84\") pod \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.699208 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80fade55-3e0a-4cdc-97fd-e3e58fc93880-secret-volume\") pod \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.699460 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80fade55-3e0a-4cdc-97fd-e3e58fc93880-config-volume\") pod \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\" (UID: \"80fade55-3e0a-4cdc-97fd-e3e58fc93880\") " Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.700430 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80fade55-3e0a-4cdc-97fd-e3e58fc93880-config-volume" (OuterVolumeSpecName: "config-volume") pod "80fade55-3e0a-4cdc-97fd-e3e58fc93880" (UID: "80fade55-3e0a-4cdc-97fd-e3e58fc93880"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.705502 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80fade55-3e0a-4cdc-97fd-e3e58fc93880-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "80fade55-3e0a-4cdc-97fd-e3e58fc93880" (UID: "80fade55-3e0a-4cdc-97fd-e3e58fc93880"). 
InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.705648 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80fade55-3e0a-4cdc-97fd-e3e58fc93880-kube-api-access-kfs84" (OuterVolumeSpecName: "kube-api-access-kfs84") pod "80fade55-3e0a-4cdc-97fd-e3e58fc93880" (UID: "80fade55-3e0a-4cdc-97fd-e3e58fc93880"). InnerVolumeSpecName "kube-api-access-kfs84". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.802874 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/80fade55-3e0a-4cdc-97fd-e3e58fc93880-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.802921 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfs84\" (UniqueName: \"kubernetes.io/projected/80fade55-3e0a-4cdc-97fd-e3e58fc93880-kube-api-access-kfs84\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:03 crc kubenswrapper[4831]: I1203 08:45:03.802941 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/80fade55-3e0a-4cdc-97fd-e3e58fc93880-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:04 crc kubenswrapper[4831]: I1203 08:45:04.146045 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" event={"ID":"80fade55-3e0a-4cdc-97fd-e3e58fc93880","Type":"ContainerDied","Data":"e7320f851bd6823d53fb8c65961a090fa8e00c7d8e71529d703da37477a3e8a0"} Dec 03 08:45:04 crc kubenswrapper[4831]: I1203 08:45:04.146091 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e7320f851bd6823d53fb8c65961a090fa8e00c7d8e71529d703da37477a3e8a0" Dec 03 08:45:04 crc kubenswrapper[4831]: I1203 08:45:04.146161 4831 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412525-t82vf" Dec 03 08:45:04 crc kubenswrapper[4831]: E1203 08:45:04.379885 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80fade55_3e0a_4cdc_97fd_e3e58fc93880.slice/crio-e7320f851bd6823d53fb8c65961a090fa8e00c7d8e71529d703da37477a3e8a0\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80fade55_3e0a_4cdc_97fd_e3e58fc93880.slice\": RecentStats: unable to find data in memory cache]" Dec 03 08:45:04 crc kubenswrapper[4831]: I1203 08:45:04.660180 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m"] Dec 03 08:45:04 crc kubenswrapper[4831]: I1203 08:45:04.672849 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412480-ngp2m"] Dec 03 08:45:05 crc kubenswrapper[4831]: I1203 08:45:05.025958 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11173dd1-e076-4ad0-8a7e-ba71b69a805e" path="/var/lib/kubelet/pods/11173dd1-e076-4ad0-8a7e-ba71b69a805e/volumes" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.700943 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sbvln"] Dec 03 08:45:11 crc kubenswrapper[4831]: E1203 08:45:11.702449 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80fade55-3e0a-4cdc-97fd-e3e58fc93880" containerName="collect-profiles" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.702477 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80fade55-3e0a-4cdc-97fd-e3e58fc93880" containerName="collect-profiles" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.702950 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="80fade55-3e0a-4cdc-97fd-e3e58fc93880" containerName="collect-profiles" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.706086 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.720868 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbvln"] Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.790533 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-catalog-content\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.790593 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hk496\" (UniqueName: \"kubernetes.io/projected/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-kube-api-access-hk496\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.790627 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-utilities\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.893272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-catalog-content\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.893366 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hk496\" (UniqueName: \"kubernetes.io/projected/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-kube-api-access-hk496\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.893417 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-utilities\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.894070 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-catalog-content\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.894289 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-utilities\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:11 crc kubenswrapper[4831]: I1203 08:45:11.924184 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hk496\" (UniqueName: 
\"kubernetes.io/projected/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-kube-api-access-hk496\") pod \"redhat-marketplace-sbvln\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:12 crc kubenswrapper[4831]: I1203 08:45:12.046065 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:12 crc kubenswrapper[4831]: I1203 08:45:12.582925 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbvln"] Dec 03 08:45:13 crc kubenswrapper[4831]: I1203 08:45:13.251796 4831 generic.go:334] "Generic (PLEG): container finished" podID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerID="661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8" exitCode=0 Dec 03 08:45:13 crc kubenswrapper[4831]: I1203 08:45:13.251917 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbvln" event={"ID":"eeb1855f-d58b-4910-9f6c-9c409a90e5fc","Type":"ContainerDied","Data":"661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8"} Dec 03 08:45:13 crc kubenswrapper[4831]: I1203 08:45:13.252149 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbvln" event={"ID":"eeb1855f-d58b-4910-9f6c-9c409a90e5fc","Type":"ContainerStarted","Data":"87e97271254226bdef87546bda89b90b040ab894faf7bb7fbe04b7463ecb9b50"} Dec 03 08:45:14 crc kubenswrapper[4831]: I1203 08:45:14.263178 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbvln" event={"ID":"eeb1855f-d58b-4910-9f6c-9c409a90e5fc","Type":"ContainerStarted","Data":"3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55"} Dec 03 08:45:15 crc kubenswrapper[4831]: I1203 08:45:15.014030 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 
08:45:15 crc kubenswrapper[4831]: E1203 08:45:15.014892 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:45:15 crc kubenswrapper[4831]: I1203 08:45:15.280370 4831 generic.go:334] "Generic (PLEG): container finished" podID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerID="3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55" exitCode=0 Dec 03 08:45:15 crc kubenswrapper[4831]: I1203 08:45:15.280423 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbvln" event={"ID":"eeb1855f-d58b-4910-9f6c-9c409a90e5fc","Type":"ContainerDied","Data":"3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55"} Dec 03 08:45:16 crc kubenswrapper[4831]: I1203 08:45:16.296409 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbvln" event={"ID":"eeb1855f-d58b-4910-9f6c-9c409a90e5fc","Type":"ContainerStarted","Data":"cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7"} Dec 03 08:45:16 crc kubenswrapper[4831]: I1203 08:45:16.322380 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sbvln" podStartSLOduration=2.87528989 podStartE2EDuration="5.322353887s" podCreationTimestamp="2025-12-03 08:45:11 +0000 UTC" firstStartedPulling="2025-12-03 08:45:13.254831795 +0000 UTC m=+8050.598415313" lastFinishedPulling="2025-12-03 08:45:15.701895802 +0000 UTC m=+8053.045479310" observedRunningTime="2025-12-03 08:45:16.317428633 +0000 UTC m=+8053.661012161" watchObservedRunningTime="2025-12-03 08:45:16.322353887 +0000 UTC 
m=+8053.665937415" Dec 03 08:45:22 crc kubenswrapper[4831]: I1203 08:45:22.047136 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:22 crc kubenswrapper[4831]: I1203 08:45:22.047853 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:22 crc kubenswrapper[4831]: I1203 08:45:22.124898 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:22 crc kubenswrapper[4831]: I1203 08:45:22.414143 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:22 crc kubenswrapper[4831]: I1203 08:45:22.469602 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbvln"] Dec 03 08:45:24 crc kubenswrapper[4831]: I1203 08:45:24.385472 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sbvln" podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerName="registry-server" containerID="cri-o://cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7" gracePeriod=2 Dec 03 08:45:24 crc kubenswrapper[4831]: I1203 08:45:24.925016 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.049790 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-utilities\") pod \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.049854 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hk496\" (UniqueName: \"kubernetes.io/projected/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-kube-api-access-hk496\") pod \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.049875 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-catalog-content\") pod \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\" (UID: \"eeb1855f-d58b-4910-9f6c-9c409a90e5fc\") " Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.051074 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-utilities" (OuterVolumeSpecName: "utilities") pod "eeb1855f-d58b-4910-9f6c-9c409a90e5fc" (UID: "eeb1855f-d58b-4910-9f6c-9c409a90e5fc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.057676 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-kube-api-access-hk496" (OuterVolumeSpecName: "kube-api-access-hk496") pod "eeb1855f-d58b-4910-9f6c-9c409a90e5fc" (UID: "eeb1855f-d58b-4910-9f6c-9c409a90e5fc"). InnerVolumeSpecName "kube-api-access-hk496". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.070130 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "eeb1855f-d58b-4910-9f6c-9c409a90e5fc" (UID: "eeb1855f-d58b-4910-9f6c-9c409a90e5fc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.152677 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.152705 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hk496\" (UniqueName: \"kubernetes.io/projected/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-kube-api-access-hk496\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.152717 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eeb1855f-d58b-4910-9f6c-9c409a90e5fc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.399127 4831 generic.go:334] "Generic (PLEG): container finished" podID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerID="cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7" exitCode=0 Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.399204 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sbvln" event={"ID":"eeb1855f-d58b-4910-9f6c-9c409a90e5fc","Type":"ContainerDied","Data":"cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7"} Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.399495 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-sbvln" event={"ID":"eeb1855f-d58b-4910-9f6c-9c409a90e5fc","Type":"ContainerDied","Data":"87e97271254226bdef87546bda89b90b040ab894faf7bb7fbe04b7463ecb9b50"} Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.399524 4831 scope.go:117] "RemoveContainer" containerID="cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.399227 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sbvln" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.429337 4831 scope.go:117] "RemoveContainer" containerID="3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.457712 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbvln"] Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.470469 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sbvln"] Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.496863 4831 scope.go:117] "RemoveContainer" containerID="661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.640549 4831 scope.go:117] "RemoveContainer" containerID="cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7" Dec 03 08:45:25 crc kubenswrapper[4831]: E1203 08:45:25.641792 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7\": container with ID starting with cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7 not found: ID does not exist" containerID="cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.641833 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7"} err="failed to get container status \"cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7\": rpc error: code = NotFound desc = could not find container \"cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7\": container with ID starting with cb879536036310258de57ef8e96ba02f3949abc10a8874bdc6fb541b1e4dcee7 not found: ID does not exist" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.641861 4831 scope.go:117] "RemoveContainer" containerID="3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55" Dec 03 08:45:25 crc kubenswrapper[4831]: E1203 08:45:25.654780 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55\": container with ID starting with 3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55 not found: ID does not exist" containerID="3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.654840 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55"} err="failed to get container status \"3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55\": rpc error: code = NotFound desc = could not find container \"3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55\": container with ID starting with 3b18c15bb5937202ea27427a59d117c8d8e6e955e82061c267b61d0f7607ae55 not found: ID does not exist" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.654865 4831 scope.go:117] "RemoveContainer" containerID="661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8" Dec 03 08:45:25 crc kubenswrapper[4831]: E1203 
08:45:25.658167 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8\": container with ID starting with 661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8 not found: ID does not exist" containerID="661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8" Dec 03 08:45:25 crc kubenswrapper[4831]: I1203 08:45:25.658371 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8"} err="failed to get container status \"661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8\": rpc error: code = NotFound desc = could not find container \"661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8\": container with ID starting with 661791bec506b200facf492d5967a56d612ed7534c9f1e169a17d929fed3fbf8 not found: ID does not exist" Dec 03 08:45:26 crc kubenswrapper[4831]: I1203 08:45:26.834998 4831 scope.go:117] "RemoveContainer" containerID="d94b20946ce0656c0d56e455c92899d295eb3a008f134160ab2a94bf4d8dd742" Dec 03 08:45:27 crc kubenswrapper[4831]: I1203 08:45:27.012948 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:45:27 crc kubenswrapper[4831]: E1203 08:45:27.013484 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:45:27 crc kubenswrapper[4831]: I1203 08:45:27.026985 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" path="/var/lib/kubelet/pods/eeb1855f-d58b-4910-9f6c-9c409a90e5fc/volumes" Dec 03 08:45:42 crc kubenswrapper[4831]: I1203 08:45:42.019152 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:45:42 crc kubenswrapper[4831]: E1203 08:45:42.020026 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:45:57 crc kubenswrapper[4831]: I1203 08:45:57.012869 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:45:57 crc kubenswrapper[4831]: E1203 08:45:57.013755 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:46:11 crc kubenswrapper[4831]: I1203 08:46:11.013490 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:46:11 crc kubenswrapper[4831]: I1203 08:46:11.948901 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"e9eb1a16c4bbe6afd4e6634c7e68468260717e1466313e861ad6fc948922a648"} Dec 03 08:46:29 
crc kubenswrapper[4831]: I1203 08:46:29.143625 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" event={"ID":"eb53812e-d0f1-4a38-b47b-00917ae4fa4f","Type":"ContainerDied","Data":"c1164d70bd5fd215b5a73720f1189c1697ebe293bd8f511f417cf65380ded665"} Dec 03 08:46:29 crc kubenswrapper[4831]: I1203 08:46:29.145292 4831 generic.go:334] "Generic (PLEG): container finished" podID="eb53812e-d0f1-4a38-b47b-00917ae4fa4f" containerID="c1164d70bd5fd215b5a73720f1189c1697ebe293bd8f511f417cf65380ded665" exitCode=0 Dec 03 08:46:30 crc kubenswrapper[4831]: I1203 08:46:30.814424 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.013712 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ssh-key\") pod \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.013812 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ceph\") pod \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.013905 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-secret-0\") pod \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.013936 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndj65\" (UniqueName: 
\"kubernetes.io/projected/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-kube-api-access-ndj65\") pod \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.013979 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-combined-ca-bundle\") pod \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.014118 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-inventory\") pod \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\" (UID: \"eb53812e-d0f1-4a38-b47b-00917ae4fa4f\") " Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.020557 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-kube-api-access-ndj65" (OuterVolumeSpecName: "kube-api-access-ndj65") pod "eb53812e-d0f1-4a38-b47b-00917ae4fa4f" (UID: "eb53812e-d0f1-4a38-b47b-00917ae4fa4f"). InnerVolumeSpecName "kube-api-access-ndj65". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.020617 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ceph" (OuterVolumeSpecName: "ceph") pod "eb53812e-d0f1-4a38-b47b-00917ae4fa4f" (UID: "eb53812e-d0f1-4a38-b47b-00917ae4fa4f"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.022390 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "eb53812e-d0f1-4a38-b47b-00917ae4fa4f" (UID: "eb53812e-d0f1-4a38-b47b-00917ae4fa4f"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.045800 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "eb53812e-d0f1-4a38-b47b-00917ae4fa4f" (UID: "eb53812e-d0f1-4a38-b47b-00917ae4fa4f"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.053580 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-inventory" (OuterVolumeSpecName: "inventory") pod "eb53812e-d0f1-4a38-b47b-00917ae4fa4f" (UID: "eb53812e-d0f1-4a38-b47b-00917ae4fa4f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.057628 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "eb53812e-d0f1-4a38-b47b-00917ae4fa4f" (UID: "eb53812e-d0f1-4a38-b47b-00917ae4fa4f"). InnerVolumeSpecName "libvirt-secret-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.116985 4831 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.117032 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndj65\" (UniqueName: \"kubernetes.io/projected/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-kube-api-access-ndj65\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.117051 4831 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.117063 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.117075 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.117086 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/eb53812e-d0f1-4a38-b47b-00917ae4fa4f-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.178785 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" event={"ID":"eb53812e-d0f1-4a38-b47b-00917ae4fa4f","Type":"ContainerDied","Data":"42950486032f5bfedebbf87374cbd187f01e7af7622d3925f306377108936651"} Dec 03 08:46:31 crc 
kubenswrapper[4831]: I1203 08:46:31.178832 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42950486032f5bfedebbf87374cbd187f01e7af7622d3925f306377108936651" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.179219 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-openstack-openstack-cell1-kwxr4" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.298190 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-fjbvm"] Dec 03 08:46:31 crc kubenswrapper[4831]: E1203 08:46:31.298703 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb53812e-d0f1-4a38-b47b-00917ae4fa4f" containerName="libvirt-openstack-openstack-cell1" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.298725 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb53812e-d0f1-4a38-b47b-00917ae4fa4f" containerName="libvirt-openstack-openstack-cell1" Dec 03 08:46:31 crc kubenswrapper[4831]: E1203 08:46:31.298739 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerName="registry-server" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.298747 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerName="registry-server" Dec 03 08:46:31 crc kubenswrapper[4831]: E1203 08:46:31.298787 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerName="extract-utilities" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.298794 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerName="extract-utilities" Dec 03 08:46:31 crc kubenswrapper[4831]: E1203 08:46:31.298812 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" 
containerName="extract-content" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.298818 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerName="extract-content" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.299010 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb53812e-d0f1-4a38-b47b-00917ae4fa4f" containerName="libvirt-openstack-openstack-cell1" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.299036 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="eeb1855f-d58b-4910-9f6c-9c409a90e5fc" containerName="registry-server" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.299867 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.304474 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.304754 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.304820 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.304933 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.305041 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.305138 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.305161 4831 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.320942 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-fjbvm"] Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.424487 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-inventory\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425239 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425293 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425309 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s8g4j\" (UniqueName: \"kubernetes.io/projected/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-kube-api-access-s8g4j\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " 
pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425532 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ceph\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425608 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425711 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425757 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425784 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425824 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.425855 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527549 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527601 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: 
\"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527650 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-inventory\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527697 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527738 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527759 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s8g4j\" (UniqueName: \"kubernetes.io/projected/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-kube-api-access-s8g4j\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527815 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ceph\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527853 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527882 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527909 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.527929 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 
08:46:31.529860 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.531136 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.533459 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.534509 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-inventory\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.535153 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: 
\"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.535483 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.537243 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ssh-key\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.537737 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.540509 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.543835 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ceph\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.547604 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s8g4j\" (UniqueName: \"kubernetes.io/projected/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-kube-api-access-s8g4j\") pod \"nova-cell1-openstack-openstack-cell1-fjbvm\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:31 crc kubenswrapper[4831]: I1203 08:46:31.642002 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:46:32 crc kubenswrapper[4831]: I1203 08:46:32.277227 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-openstack-cell1-fjbvm"] Dec 03 08:46:32 crc kubenswrapper[4831]: W1203 08:46:32.279068 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71e965c2_d7fd_4d55_8383_e50b6f3ac1ac.slice/crio-1a0c1a63c325ebe164bb1ad0e1cc5b81af54d694296ba9d55e13d2e393edb4ac WatchSource:0}: Error finding container 1a0c1a63c325ebe164bb1ad0e1cc5b81af54d694296ba9d55e13d2e393edb4ac: Status 404 returned error can't find the container with id 1a0c1a63c325ebe164bb1ad0e1cc5b81af54d694296ba9d55e13d2e393edb4ac Dec 03 08:46:33 crc kubenswrapper[4831]: I1203 08:46:33.205979 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" event={"ID":"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac","Type":"ContainerStarted","Data":"dd0b6e9799a81ff91a67a5a456fdb811991d6bf6f4a1a2498e40a686f50e22a8"} Dec 03 08:46:33 crc kubenswrapper[4831]: I1203 08:46:33.206619 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" event={"ID":"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac","Type":"ContainerStarted","Data":"1a0c1a63c325ebe164bb1ad0e1cc5b81af54d694296ba9d55e13d2e393edb4ac"} Dec 03 08:46:33 crc kubenswrapper[4831]: I1203 08:46:33.250656 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" podStartSLOduration=2.032623256 podStartE2EDuration="2.250634786s" podCreationTimestamp="2025-12-03 08:46:31 +0000 UTC" firstStartedPulling="2025-12-03 08:46:32.282798172 +0000 UTC m=+8129.626381680" lastFinishedPulling="2025-12-03 08:46:32.500809702 +0000 UTC m=+8129.844393210" observedRunningTime="2025-12-03 08:46:33.238800997 +0000 UTC m=+8130.582384525" watchObservedRunningTime="2025-12-03 08:46:33.250634786 +0000 UTC m=+8130.594218304" Dec 03 08:48:27 crc kubenswrapper[4831]: I1203 08:48:27.596854 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:48:27 crc kubenswrapper[4831]: I1203 08:48:27.597538 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:48:57 crc kubenswrapper[4831]: I1203 08:48:57.596363 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:48:57 crc 
kubenswrapper[4831]: I1203 08:48:57.596926 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:49:27 crc kubenswrapper[4831]: I1203 08:49:27.597461 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:49:27 crc kubenswrapper[4831]: I1203 08:49:27.598118 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:49:27 crc kubenswrapper[4831]: I1203 08:49:27.598173 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:49:27 crc kubenswrapper[4831]: I1203 08:49:27.599145 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e9eb1a16c4bbe6afd4e6634c7e68468260717e1466313e861ad6fc948922a648"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:49:27 crc kubenswrapper[4831]: I1203 08:49:27.599213 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://e9eb1a16c4bbe6afd4e6634c7e68468260717e1466313e861ad6fc948922a648" gracePeriod=600 Dec 03 08:49:28 crc kubenswrapper[4831]: I1203 08:49:28.419942 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="e9eb1a16c4bbe6afd4e6634c7e68468260717e1466313e861ad6fc948922a648" exitCode=0 Dec 03 08:49:28 crc kubenswrapper[4831]: I1203 08:49:28.420162 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"e9eb1a16c4bbe6afd4e6634c7e68468260717e1466313e861ad6fc948922a648"} Dec 03 08:49:28 crc kubenswrapper[4831]: I1203 08:49:28.420459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed"} Dec 03 08:49:28 crc kubenswrapper[4831]: I1203 08:49:28.420493 4831 scope.go:117] "RemoveContainer" containerID="8c0620bce0bd8778af3825425ac91f178d859bc62d4ff3c28cd87643972e22b5" Dec 03 08:49:56 crc kubenswrapper[4831]: I1203 08:49:56.813268 4831 generic.go:334] "Generic (PLEG): container finished" podID="71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" containerID="dd0b6e9799a81ff91a67a5a456fdb811991d6bf6f4a1a2498e40a686f50e22a8" exitCode=0 Dec 03 08:49:56 crc kubenswrapper[4831]: I1203 08:49:56.813384 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" event={"ID":"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac","Type":"ContainerDied","Data":"dd0b6e9799a81ff91a67a5a456fdb811991d6bf6f4a1a2498e40a686f50e22a8"} Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.382000 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398213 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-0\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398272 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-0\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398383 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ceph\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398424 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-0\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398487 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-1\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398613 4831 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s8g4j\" (UniqueName: \"kubernetes.io/projected/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-kube-api-access-s8g4j\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398652 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-combined-ca-bundle\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398682 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-1\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398717 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-1\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398738 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ssh-key\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.398760 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-inventory\") pod \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\" (UID: \"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac\") " Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.410108 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-kube-api-access-s8g4j" (OuterVolumeSpecName: "kube-api-access-s8g4j") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "kube-api-access-s8g4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.411829 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ceph" (OuterVolumeSpecName: "ceph") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.413536 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.440276 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "nova-cell1-compute-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.440632 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.442943 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.459446 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.464013 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "nova-cells-global-config-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.468629 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-inventory" (OuterVolumeSpecName: "inventory") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.476072 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.481545 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" (UID: "71e965c2-d7fd-4d55-8383-e50b6f3ac1ac"). InnerVolumeSpecName "nova-cells-global-config-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.501234 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.501408 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.501467 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.501519 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s8g4j\" (UniqueName: \"kubernetes.io/projected/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-kube-api-access-s8g4j\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.501587 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.501638 4831 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.501695 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 
03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.501994 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.502049 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.502098 4831 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.502172 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/71e965c2-d7fd-4d55-8383-e50b6f3ac1ac-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.842257 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" event={"ID":"71e965c2-d7fd-4d55-8383-e50b6f3ac1ac","Type":"ContainerDied","Data":"1a0c1a63c325ebe164bb1ad0e1cc5b81af54d694296ba9d55e13d2e393edb4ac"} Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.842534 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1a0c1a63c325ebe164bb1ad0e1cc5b81af54d694296ba9d55e13d2e393edb4ac" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.842423 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-openstack-cell1-fjbvm" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.984826 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-7z2tk"] Dec 03 08:49:58 crc kubenswrapper[4831]: E1203 08:49:58.985600 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" containerName="nova-cell1-openstack-openstack-cell1" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.985625 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" containerName="nova-cell1-openstack-openstack-cell1" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.985942 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="71e965c2-d7fd-4d55-8383-e50b6f3ac1ac" containerName="nova-cell1-openstack-openstack-cell1" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.986984 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.989940 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.989955 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.990439 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.991011 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.994894 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:49:58 crc kubenswrapper[4831]: I1203 08:49:58.996694 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-7z2tk"] Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.033223 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ssh-key\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.033306 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " 
pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.033384 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.033415 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.033476 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.033518 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-inventory\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.033579 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kb5bl\" (UniqueName: \"kubernetes.io/projected/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-kube-api-access-kb5bl\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.033728 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceph\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.136077 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceph\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.136147 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ssh-key\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.136170 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " 
pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.136206 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.136225 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.136272 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.136307 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-inventory\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.136354 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kb5bl\" (UniqueName: 
\"kubernetes.io/projected/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-kube-api-access-kb5bl\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.141033 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-inventory\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.141136 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-1\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.141243 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-2\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.142895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ssh-key\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.143773 4831 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-0\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.146102 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceph\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.146728 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-telemetry-combined-ca-bundle\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.151576 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kb5bl\" (UniqueName: \"kubernetes.io/projected/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-kube-api-access-kb5bl\") pod \"telemetry-openstack-openstack-cell1-7z2tk\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.328497 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.933860 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-openstack-openstack-cell1-7z2tk"] Dec 03 08:49:59 crc kubenswrapper[4831]: W1203 08:49:59.940581 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode35e9dbe_4290_4ad4_80b2_2672e5d6903f.slice/crio-c92194bd27b63e60d4f063edf04e839005bba79c5355dbfbe5e54f796a185d02 WatchSource:0}: Error finding container c92194bd27b63e60d4f063edf04e839005bba79c5355dbfbe5e54f796a185d02: Status 404 returned error can't find the container with id c92194bd27b63e60d4f063edf04e839005bba79c5355dbfbe5e54f796a185d02 Dec 03 08:49:59 crc kubenswrapper[4831]: I1203 08:49:59.943522 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:50:00 crc kubenswrapper[4831]: I1203 08:50:00.864356 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" event={"ID":"e35e9dbe-4290-4ad4-80b2-2672e5d6903f","Type":"ContainerStarted","Data":"af6b90630c2c2d58ad24da43260fe80f76660075632d0ca440f8387e7d57e5e2"} Dec 03 08:50:00 crc kubenswrapper[4831]: I1203 08:50:00.864849 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" event={"ID":"e35e9dbe-4290-4ad4-80b2-2672e5d6903f","Type":"ContainerStarted","Data":"c92194bd27b63e60d4f063edf04e839005bba79c5355dbfbe5e54f796a185d02"} Dec 03 08:50:00 crc kubenswrapper[4831]: I1203 08:50:00.891475 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" podStartSLOduration=2.67282496 podStartE2EDuration="2.89145746s" podCreationTimestamp="2025-12-03 08:49:58 +0000 UTC" firstStartedPulling="2025-12-03 08:49:59.943068181 +0000 UTC 
m=+8337.286651729" lastFinishedPulling="2025-12-03 08:50:00.161700711 +0000 UTC m=+8337.505284229" observedRunningTime="2025-12-03 08:50:00.885412801 +0000 UTC m=+8338.228996309" watchObservedRunningTime="2025-12-03 08:50:00.89145746 +0000 UTC m=+8338.235040978" Dec 03 08:51:09 crc kubenswrapper[4831]: I1203 08:51:09.995056 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2dzwr"] Dec 03 08:51:09 crc kubenswrapper[4831]: I1203 08:51:09.997623 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.020908 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dzwr"] Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.126017 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-utilities\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.126095 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2khww\" (UniqueName: \"kubernetes.io/projected/5507a8ff-b35c-480f-8503-bae215501995-kube-api-access-2khww\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.126647 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-catalog-content\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " 
pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.229125 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-catalog-content\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.229273 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-utilities\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.229347 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2khww\" (UniqueName: \"kubernetes.io/projected/5507a8ff-b35c-480f-8503-bae215501995-kube-api-access-2khww\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.230149 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-catalog-content\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.230385 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-utilities\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc 
kubenswrapper[4831]: I1203 08:51:10.249675 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2khww\" (UniqueName: \"kubernetes.io/projected/5507a8ff-b35c-480f-8503-bae215501995-kube-api-access-2khww\") pod \"redhat-operators-2dzwr\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.340510 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:10 crc kubenswrapper[4831]: I1203 08:51:10.930596 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2dzwr"] Dec 03 08:51:11 crc kubenswrapper[4831]: I1203 08:51:11.769122 4831 generic.go:334] "Generic (PLEG): container finished" podID="5507a8ff-b35c-480f-8503-bae215501995" containerID="c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd" exitCode=0 Dec 03 08:51:11 crc kubenswrapper[4831]: I1203 08:51:11.769169 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzwr" event={"ID":"5507a8ff-b35c-480f-8503-bae215501995","Type":"ContainerDied","Data":"c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd"} Dec 03 08:51:11 crc kubenswrapper[4831]: I1203 08:51:11.769602 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzwr" event={"ID":"5507a8ff-b35c-480f-8503-bae215501995","Type":"ContainerStarted","Data":"c36b30b3978352d059d25f9b875d1839c84bb64a6bb03ea33d733a4354175a45"} Dec 03 08:51:13 crc kubenswrapper[4831]: I1203 08:51:13.801513 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzwr" event={"ID":"5507a8ff-b35c-480f-8503-bae215501995","Type":"ContainerStarted","Data":"d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72"} Dec 03 08:51:16 crc kubenswrapper[4831]: I1203 
08:51:16.842271 4831 generic.go:334] "Generic (PLEG): container finished" podID="5507a8ff-b35c-480f-8503-bae215501995" containerID="d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72" exitCode=0 Dec 03 08:51:16 crc kubenswrapper[4831]: I1203 08:51:16.842420 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzwr" event={"ID":"5507a8ff-b35c-480f-8503-bae215501995","Type":"ContainerDied","Data":"d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72"} Dec 03 08:51:17 crc kubenswrapper[4831]: I1203 08:51:17.859694 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzwr" event={"ID":"5507a8ff-b35c-480f-8503-bae215501995","Type":"ContainerStarted","Data":"78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda"} Dec 03 08:51:17 crc kubenswrapper[4831]: I1203 08:51:17.900261 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2dzwr" podStartSLOduration=3.404868169 podStartE2EDuration="8.900240479s" podCreationTimestamp="2025-12-03 08:51:09 +0000 UTC" firstStartedPulling="2025-12-03 08:51:11.771618565 +0000 UTC m=+8409.115202063" lastFinishedPulling="2025-12-03 08:51:17.266990825 +0000 UTC m=+8414.610574373" observedRunningTime="2025-12-03 08:51:17.88551847 +0000 UTC m=+8415.229101998" watchObservedRunningTime="2025-12-03 08:51:17.900240479 +0000 UTC m=+8415.243823987" Dec 03 08:51:20 crc kubenswrapper[4831]: I1203 08:51:20.341892 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:20 crc kubenswrapper[4831]: I1203 08:51:20.342653 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:21 crc kubenswrapper[4831]: I1203 08:51:21.435237 4831 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-2dzwr" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="registry-server" probeResult="failure" output=< Dec 03 08:51:21 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 08:51:21 crc kubenswrapper[4831]: > Dec 03 08:51:27 crc kubenswrapper[4831]: I1203 08:51:27.067710 4831 scope.go:117] "RemoveContainer" containerID="913bb0c51e5d951b3fe2c53930d145c3fd41a7c6f046664fe00e7441752239b8" Dec 03 08:51:27 crc kubenswrapper[4831]: I1203 08:51:27.102935 4831 scope.go:117] "RemoveContainer" containerID="b9675819893a97fc89d7a6b8052b593b6f6528d279ab454a899fb93ea9e662e1" Dec 03 08:51:27 crc kubenswrapper[4831]: I1203 08:51:27.178577 4831 scope.go:117] "RemoveContainer" containerID="146be2a867027b894e3ea134600d85d0a1b33dace1a59fa9319b19c45174cdf6" Dec 03 08:51:27 crc kubenswrapper[4831]: I1203 08:51:27.596538 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:51:27 crc kubenswrapper[4831]: I1203 08:51:27.596592 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:51:30 crc kubenswrapper[4831]: I1203 08:51:30.432417 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:30 crc kubenswrapper[4831]: I1203 08:51:30.541126 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:30 crc kubenswrapper[4831]: 
I1203 08:51:30.681976 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2dzwr"] Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.013064 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2dzwr" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="registry-server" containerID="cri-o://78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda" gracePeriod=2 Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.598297 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.661377 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2khww\" (UniqueName: \"kubernetes.io/projected/5507a8ff-b35c-480f-8503-bae215501995-kube-api-access-2khww\") pod \"5507a8ff-b35c-480f-8503-bae215501995\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.661465 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-utilities\") pod \"5507a8ff-b35c-480f-8503-bae215501995\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.661523 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-catalog-content\") pod \"5507a8ff-b35c-480f-8503-bae215501995\" (UID: \"5507a8ff-b35c-480f-8503-bae215501995\") " Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.662585 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-utilities" (OuterVolumeSpecName: 
"utilities") pod "5507a8ff-b35c-480f-8503-bae215501995" (UID: "5507a8ff-b35c-480f-8503-bae215501995"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.671274 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5507a8ff-b35c-480f-8503-bae215501995-kube-api-access-2khww" (OuterVolumeSpecName: "kube-api-access-2khww") pod "5507a8ff-b35c-480f-8503-bae215501995" (UID: "5507a8ff-b35c-480f-8503-bae215501995"). InnerVolumeSpecName "kube-api-access-2khww". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.764737 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2khww\" (UniqueName: \"kubernetes.io/projected/5507a8ff-b35c-480f-8503-bae215501995-kube-api-access-2khww\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.764773 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.786044 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5507a8ff-b35c-480f-8503-bae215501995" (UID: "5507a8ff-b35c-480f-8503-bae215501995"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:51:32 crc kubenswrapper[4831]: I1203 08:51:32.866941 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5507a8ff-b35c-480f-8503-bae215501995-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.025412 4831 generic.go:334] "Generic (PLEG): container finished" podID="5507a8ff-b35c-480f-8503-bae215501995" containerID="78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda" exitCode=0 Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.025531 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-2dzwr" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.040476 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzwr" event={"ID":"5507a8ff-b35c-480f-8503-bae215501995","Type":"ContainerDied","Data":"78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda"} Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.040601 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2dzwr" event={"ID":"5507a8ff-b35c-480f-8503-bae215501995","Type":"ContainerDied","Data":"c36b30b3978352d059d25f9b875d1839c84bb64a6bb03ea33d733a4354175a45"} Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.040645 4831 scope.go:117] "RemoveContainer" containerID="78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.067066 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2dzwr"] Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.070680 4831 scope.go:117] "RemoveContainer" containerID="d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 
08:51:33.074605 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2dzwr"] Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.100563 4831 scope.go:117] "RemoveContainer" containerID="c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.173520 4831 scope.go:117] "RemoveContainer" containerID="78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda" Dec 03 08:51:33 crc kubenswrapper[4831]: E1203 08:51:33.174077 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda\": container with ID starting with 78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda not found: ID does not exist" containerID="78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.174120 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda"} err="failed to get container status \"78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda\": rpc error: code = NotFound desc = could not find container \"78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda\": container with ID starting with 78b4b4463e90f00271c88de267ffedfc77a4f3322ebea13f692951beb3351eda not found: ID does not exist" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.174145 4831 scope.go:117] "RemoveContainer" containerID="d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72" Dec 03 08:51:33 crc kubenswrapper[4831]: E1203 08:51:33.174827 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72\": container with ID 
starting with d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72 not found: ID does not exist" containerID="d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.174871 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72"} err="failed to get container status \"d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72\": rpc error: code = NotFound desc = could not find container \"d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72\": container with ID starting with d029ffd704e33068d3075bdb7a5fcdb94c3c592f68284c1b45e79dc0e2704b72 not found: ID does not exist" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.174899 4831 scope.go:117] "RemoveContainer" containerID="c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd" Dec 03 08:51:33 crc kubenswrapper[4831]: E1203 08:51:33.175226 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd\": container with ID starting with c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd not found: ID does not exist" containerID="c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd" Dec 03 08:51:33 crc kubenswrapper[4831]: I1203 08:51:33.175251 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd"} err="failed to get container status \"c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd\": rpc error: code = NotFound desc = could not find container \"c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd\": container with ID starting with c41005225c876efe8ca1268189d820149b0efbf56239720f3fa0725b8a8da7dd not found: 
ID does not exist" Dec 03 08:51:35 crc kubenswrapper[4831]: I1203 08:51:35.104671 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5507a8ff-b35c-480f-8503-bae215501995" path="/var/lib/kubelet/pods/5507a8ff-b35c-480f-8503-bae215501995/volumes" Dec 03 08:51:57 crc kubenswrapper[4831]: I1203 08:51:57.596188 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:51:57 crc kubenswrapper[4831]: I1203 08:51:57.597043 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:52:27 crc kubenswrapper[4831]: I1203 08:52:27.596358 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:52:27 crc kubenswrapper[4831]: I1203 08:52:27.597984 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 08:52:27 crc kubenswrapper[4831]: I1203 08:52:27.598127 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 08:52:27 crc 
kubenswrapper[4831]: I1203 08:52:27.599258 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 08:52:27 crc kubenswrapper[4831]: I1203 08:52:27.599460 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" gracePeriod=600 Dec 03 08:52:28 crc kubenswrapper[4831]: E1203 08:52:28.458276 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:52:28 crc kubenswrapper[4831]: I1203 08:52:28.800046 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" exitCode=0 Dec 03 08:52:28 crc kubenswrapper[4831]: I1203 08:52:28.800100 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed"} Dec 03 08:52:28 crc kubenswrapper[4831]: I1203 08:52:28.800144 4831 scope.go:117] "RemoveContainer" 
containerID="e9eb1a16c4bbe6afd4e6634c7e68468260717e1466313e861ad6fc948922a648" Dec 03 08:52:28 crc kubenswrapper[4831]: I1203 08:52:28.800890 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:52:28 crc kubenswrapper[4831]: E1203 08:52:28.801119 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:52:42 crc kubenswrapper[4831]: I1203 08:52:42.013085 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:52:42 crc kubenswrapper[4831]: E1203 08:52:42.014127 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:52:56 crc kubenswrapper[4831]: I1203 08:52:56.013835 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:52:56 crc kubenswrapper[4831]: E1203 08:52:56.014842 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:53:10 crc kubenswrapper[4831]: I1203 08:53:10.013309 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:53:10 crc kubenswrapper[4831]: E1203 08:53:10.014586 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:53:25 crc kubenswrapper[4831]: I1203 08:53:25.013999 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:53:25 crc kubenswrapper[4831]: E1203 08:53:25.015308 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:53:36 crc kubenswrapper[4831]: I1203 08:53:36.012848 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:53:36 crc kubenswrapper[4831]: E1203 08:53:36.013602 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:53:47 crc kubenswrapper[4831]: I1203 08:53:47.013361 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:53:47 crc kubenswrapper[4831]: E1203 08:53:47.014690 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:53:59 crc kubenswrapper[4831]: I1203 08:53:59.019978 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:53:59 crc kubenswrapper[4831]: E1203 08:53:59.021940 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:54:10 crc kubenswrapper[4831]: I1203 08:54:10.014777 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:54:10 crc kubenswrapper[4831]: E1203 08:54:10.015847 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:54:21 crc kubenswrapper[4831]: I1203 08:54:21.013591 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:54:21 crc kubenswrapper[4831]: E1203 08:54:21.014462 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:54:23 crc kubenswrapper[4831]: I1203 08:54:23.330351 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" event={"ID":"e35e9dbe-4290-4ad4-80b2-2672e5d6903f","Type":"ContainerDied","Data":"af6b90630c2c2d58ad24da43260fe80f76660075632d0ca440f8387e7d57e5e2"} Dec 03 08:54:23 crc kubenswrapper[4831]: I1203 08:54:23.330347 4831 generic.go:334] "Generic (PLEG): container finished" podID="e35e9dbe-4290-4ad4-80b2-2672e5d6903f" containerID="af6b90630c2c2d58ad24da43260fe80f76660075632d0ca440f8387e7d57e5e2" exitCode=0 Dec 03 08:54:24 crc kubenswrapper[4831]: I1203 08:54:24.914707 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.091539 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-inventory\") pod \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.091605 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-1\") pod \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.091688 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kb5bl\" (UniqueName: \"kubernetes.io/projected/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-kube-api-access-kb5bl\") pod \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.091715 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceph\") pod \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.091788 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-2\") pod \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.091888 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ssh-key\") pod \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.091976 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-telemetry-combined-ca-bundle\") pod \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.092061 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-0\") pod \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\" (UID: \"e35e9dbe-4290-4ad4-80b2-2672e5d6903f\") " Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.098723 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "e35e9dbe-4290-4ad4-80b2-2672e5d6903f" (UID: "e35e9dbe-4290-4ad4-80b2-2672e5d6903f"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.099268 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-kube-api-access-kb5bl" (OuterVolumeSpecName: "kube-api-access-kb5bl") pod "e35e9dbe-4290-4ad4-80b2-2672e5d6903f" (UID: "e35e9dbe-4290-4ad4-80b2-2672e5d6903f"). InnerVolumeSpecName "kube-api-access-kb5bl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.099284 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceph" (OuterVolumeSpecName: "ceph") pod "e35e9dbe-4290-4ad4-80b2-2672e5d6903f" (UID: "e35e9dbe-4290-4ad4-80b2-2672e5d6903f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.134539 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-inventory" (OuterVolumeSpecName: "inventory") pod "e35e9dbe-4290-4ad4-80b2-2672e5d6903f" (UID: "e35e9dbe-4290-4ad4-80b2-2672e5d6903f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.136141 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "e35e9dbe-4290-4ad4-80b2-2672e5d6903f" (UID: "e35e9dbe-4290-4ad4-80b2-2672e5d6903f"). InnerVolumeSpecName "ceilometer-compute-config-data-2". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.146080 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "e35e9dbe-4290-4ad4-80b2-2672e5d6903f" (UID: "e35e9dbe-4290-4ad4-80b2-2672e5d6903f"). InnerVolumeSpecName "ssh-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.146948 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "e35e9dbe-4290-4ad4-80b2-2672e5d6903f" (UID: "e35e9dbe-4290-4ad4-80b2-2672e5d6903f"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.154081 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "e35e9dbe-4290-4ad4-80b2-2672e5d6903f" (UID: "e35e9dbe-4290-4ad4-80b2-2672e5d6903f"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.194830 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.195111 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.195126 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.195157 4831 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-kb5bl\" (UniqueName: \"kubernetes.io/projected/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-kube-api-access-kb5bl\") on node \"crc\" DevicePath \"\"" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.195187 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.195205 4831 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.195218 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.195242 4831 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e35e9dbe-4290-4ad4-80b2-2672e5d6903f-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.376543 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" event={"ID":"e35e9dbe-4290-4ad4-80b2-2672e5d6903f","Type":"ContainerDied","Data":"c92194bd27b63e60d4f063edf04e839005bba79c5355dbfbe5e54f796a185d02"} Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.376585 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c92194bd27b63e60d4f063edf04e839005bba79c5355dbfbe5e54f796a185d02" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.376798 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-openstack-openstack-cell1-7z2tk" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.491397 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-7hxpj"] Dec 03 08:54:25 crc kubenswrapper[4831]: E1203 08:54:25.491829 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="extract-content" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.491846 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="extract-content" Dec 03 08:54:25 crc kubenswrapper[4831]: E1203 08:54:25.491868 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="registry-server" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.491877 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="registry-server" Dec 03 08:54:25 crc kubenswrapper[4831]: E1203 08:54:25.491885 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e35e9dbe-4290-4ad4-80b2-2672e5d6903f" containerName="telemetry-openstack-openstack-cell1" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.491891 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e35e9dbe-4290-4ad4-80b2-2672e5d6903f" containerName="telemetry-openstack-openstack-cell1" Dec 03 08:54:25 crc kubenswrapper[4831]: E1203 08:54:25.491910 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="extract-utilities" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.491917 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="extract-utilities" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.492130 4831 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5507a8ff-b35c-480f-8503-bae215501995" containerName="registry-server" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.492154 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e35e9dbe-4290-4ad4-80b2-2672e5d6903f" containerName="telemetry-openstack-openstack-cell1" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.492949 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.496239 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.497229 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-sriov-agent-neutron-config" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.497433 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.497765 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.498227 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.511034 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-7hxpj"] Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.607260 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: 
\"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.607333 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.607377 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.607708 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.608067 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rsxr\" (UniqueName: \"kubernetes.io/projected/b0f24e55-9970-4294-8bca-b289f2958f85-kube-api-access-7rsxr\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.608166 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.710473 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.710656 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rsxr\" (UniqueName: \"kubernetes.io/projected/b0f24e55-9970-4294-8bca-b289f2958f85-kube-api-access-7rsxr\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.710694 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.710756 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: 
\"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.710921 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.710973 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.715243 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-inventory\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.715623 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-agent-neutron-config-0\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.717266 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-combined-ca-bundle\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.719240 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ceph\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.728141 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ssh-key\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.736876 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rsxr\" (UniqueName: \"kubernetes.io/projected/b0f24e55-9970-4294-8bca-b289f2958f85-kube-api-access-7rsxr\") pod \"neutron-sriov-openstack-openstack-cell1-7hxpj\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:25 crc kubenswrapper[4831]: I1203 08:54:25.863760 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:54:26 crc kubenswrapper[4831]: I1203 08:54:26.537153 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-sriov-openstack-openstack-cell1-7hxpj"] Dec 03 08:54:27 crc kubenswrapper[4831]: I1203 08:54:27.404095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" event={"ID":"b0f24e55-9970-4294-8bca-b289f2958f85","Type":"ContainerStarted","Data":"17fc0492fe402b55626ebccf401b9966d8df236441343c220c9a6518dc7fe191"} Dec 03 08:54:27 crc kubenswrapper[4831]: I1203 08:54:27.404379 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" event={"ID":"b0f24e55-9970-4294-8bca-b289f2958f85","Type":"ContainerStarted","Data":"5a4f020b7eb7d6de1128d4bc82a892e8f95dad979f61f79f04bc39f46f79066b"} Dec 03 08:54:27 crc kubenswrapper[4831]: I1203 08:54:27.430689 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" podStartSLOduration=2.263525754 podStartE2EDuration="2.43066411s" podCreationTimestamp="2025-12-03 08:54:25 +0000 UTC" firstStartedPulling="2025-12-03 08:54:26.537639265 +0000 UTC m=+8603.881222773" lastFinishedPulling="2025-12-03 08:54:26.704777621 +0000 UTC m=+8604.048361129" observedRunningTime="2025-12-03 08:54:27.422211446 +0000 UTC m=+8604.765794984" watchObservedRunningTime="2025-12-03 08:54:27.43066411 +0000 UTC m=+8604.774247658" Dec 03 08:54:32 crc kubenswrapper[4831]: I1203 08:54:32.012773 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:54:32 crc kubenswrapper[4831]: E1203 08:54:32.014250 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:54:45 crc kubenswrapper[4831]: I1203 08:54:45.015202 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:54:45 crc kubenswrapper[4831]: E1203 08:54:45.016270 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:54:56 crc kubenswrapper[4831]: I1203 08:54:56.013329 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:54:56 crc kubenswrapper[4831]: E1203 08:54:56.014142 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:55:11 crc kubenswrapper[4831]: I1203 08:55:11.013999 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:55:11 crc kubenswrapper[4831]: E1203 08:55:11.015131 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:55:26 crc kubenswrapper[4831]: I1203 08:55:26.013548 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:55:26 crc kubenswrapper[4831]: E1203 08:55:26.014502 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.244386 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jcxbq"] Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.247313 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.252038 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jcxbq"] Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.372186 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-catalog-content\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.372515 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhhqm\" (UniqueName: \"kubernetes.io/projected/41a65f44-18a8-4450-b7d6-56d0ec01ef63-kube-api-access-zhhqm\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.372654 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-utilities\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.474863 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-catalog-content\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.474922 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zhhqm\" (UniqueName: \"kubernetes.io/projected/41a65f44-18a8-4450-b7d6-56d0ec01ef63-kube-api-access-zhhqm\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.474961 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-utilities\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.475477 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-catalog-content\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.475489 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-utilities\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.500537 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhhqm\" (UniqueName: \"kubernetes.io/projected/41a65f44-18a8-4450-b7d6-56d0ec01ef63-kube-api-access-zhhqm\") pod \"community-operators-jcxbq\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:34 crc kubenswrapper[4831]: I1203 08:55:34.589888 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:35 crc kubenswrapper[4831]: I1203 08:55:35.180959 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jcxbq"] Dec 03 08:55:35 crc kubenswrapper[4831]: I1203 08:55:35.238613 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcxbq" event={"ID":"41a65f44-18a8-4450-b7d6-56d0ec01ef63","Type":"ContainerStarted","Data":"b16d7bd1201091667b227df368a8f988072fa10ed7d08a1b1adf32762aefa089"} Dec 03 08:55:36 crc kubenswrapper[4831]: I1203 08:55:36.250944 4831 generic.go:334] "Generic (PLEG): container finished" podID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerID="128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e" exitCode=0 Dec 03 08:55:36 crc kubenswrapper[4831]: I1203 08:55:36.250990 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcxbq" event={"ID":"41a65f44-18a8-4450-b7d6-56d0ec01ef63","Type":"ContainerDied","Data":"128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e"} Dec 03 08:55:36 crc kubenswrapper[4831]: I1203 08:55:36.253664 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 08:55:37 crc kubenswrapper[4831]: I1203 08:55:37.265606 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcxbq" event={"ID":"41a65f44-18a8-4450-b7d6-56d0ec01ef63","Type":"ContainerStarted","Data":"7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d"} Dec 03 08:55:38 crc kubenswrapper[4831]: I1203 08:55:38.276454 4831 generic.go:334] "Generic (PLEG): container finished" podID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerID="7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d" exitCode=0 Dec 03 08:55:38 crc kubenswrapper[4831]: I1203 08:55:38.276742 4831 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/community-operators-jcxbq" event={"ID":"41a65f44-18a8-4450-b7d6-56d0ec01ef63","Type":"ContainerDied","Data":"7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d"} Dec 03 08:55:39 crc kubenswrapper[4831]: I1203 08:55:39.013704 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:55:39 crc kubenswrapper[4831]: E1203 08:55:39.014293 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:55:39 crc kubenswrapper[4831]: I1203 08:55:39.290101 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcxbq" event={"ID":"41a65f44-18a8-4450-b7d6-56d0ec01ef63","Type":"ContainerStarted","Data":"a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1"} Dec 03 08:55:39 crc kubenswrapper[4831]: I1203 08:55:39.311868 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jcxbq" podStartSLOduration=2.719067876 podStartE2EDuration="5.311843682s" podCreationTimestamp="2025-12-03 08:55:34 +0000 UTC" firstStartedPulling="2025-12-03 08:55:36.253391473 +0000 UTC m=+8673.596974981" lastFinishedPulling="2025-12-03 08:55:38.846167279 +0000 UTC m=+8676.189750787" observedRunningTime="2025-12-03 08:55:39.304534025 +0000 UTC m=+8676.648117563" watchObservedRunningTime="2025-12-03 08:55:39.311843682 +0000 UTC m=+8676.655427230" Dec 03 08:55:43 crc kubenswrapper[4831]: I1203 08:55:43.802962 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/certified-operators-kshn6"] Dec 03 08:55:43 crc kubenswrapper[4831]: I1203 08:55:43.808331 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:43 crc kubenswrapper[4831]: I1203 08:55:43.818887 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kshn6"] Dec 03 08:55:43 crc kubenswrapper[4831]: I1203 08:55:43.905978 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-utilities\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:43 crc kubenswrapper[4831]: I1203 08:55:43.906074 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7kqg\" (UniqueName: \"kubernetes.io/projected/a2514854-4180-4ac5-8c48-6953ec3b32bc-kube-api-access-b7kqg\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:43 crc kubenswrapper[4831]: I1203 08:55:43.906094 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-catalog-content\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.008197 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-utilities\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") 
" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.008297 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7kqg\" (UniqueName: \"kubernetes.io/projected/a2514854-4180-4ac5-8c48-6953ec3b32bc-kube-api-access-b7kqg\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.008333 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-catalog-content\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.008862 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-catalog-content\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.008868 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-utilities\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.028494 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7kqg\" (UniqueName: \"kubernetes.io/projected/a2514854-4180-4ac5-8c48-6953ec3b32bc-kube-api-access-b7kqg\") pod \"certified-operators-kshn6\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " 
pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.131381 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.590896 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.591208 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.642989 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:44 crc kubenswrapper[4831]: I1203 08:55:44.649063 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kshn6"] Dec 03 08:55:45 crc kubenswrapper[4831]: I1203 08:55:45.364991 4831 generic.go:334] "Generic (PLEG): container finished" podID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerID="bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856" exitCode=0 Dec 03 08:55:45 crc kubenswrapper[4831]: I1203 08:55:45.365120 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kshn6" event={"ID":"a2514854-4180-4ac5-8c48-6953ec3b32bc","Type":"ContainerDied","Data":"bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856"} Dec 03 08:55:45 crc kubenswrapper[4831]: I1203 08:55:45.366524 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kshn6" event={"ID":"a2514854-4180-4ac5-8c48-6953ec3b32bc","Type":"ContainerStarted","Data":"d02da32e25895525490818c7e7305a26cec1a3082b17d585ba0ac59a5eafdf69"} Dec 03 08:55:45 crc kubenswrapper[4831]: I1203 08:55:45.444908 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:46 crc kubenswrapper[4831]: I1203 08:55:46.378025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kshn6" event={"ID":"a2514854-4180-4ac5-8c48-6953ec3b32bc","Type":"ContainerStarted","Data":"a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860"} Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.004233 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sk2sg"] Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.009699 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.025873 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk2sg"] Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.172491 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-utilities\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.172738 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-catalog-content\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.173585 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmmz5\" (UniqueName: 
\"kubernetes.io/projected/80e2ca55-52a9-4317-ac36-1592f5df5424-kube-api-access-xmmz5\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.277279 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmmz5\" (UniqueName: \"kubernetes.io/projected/80e2ca55-52a9-4317-ac36-1592f5df5424-kube-api-access-xmmz5\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.277489 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-utilities\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.277564 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-catalog-content\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.278121 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-catalog-content\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.278773 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-utilities\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.297652 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmmz5\" (UniqueName: \"kubernetes.io/projected/80e2ca55-52a9-4317-ac36-1592f5df5424-kube-api-access-xmmz5\") pod \"redhat-marketplace-sk2sg\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:47 crc kubenswrapper[4831]: I1203 08:55:47.332470 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:48 crc kubenswrapper[4831]: I1203 08:55:48.081036 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk2sg"] Dec 03 08:55:48 crc kubenswrapper[4831]: W1203 08:55:48.085439 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80e2ca55_52a9_4317_ac36_1592f5df5424.slice/crio-e2e8d078030f41829391bc2f1f80adebddbfb30faad9d5fc80033021e329e1c2 WatchSource:0}: Error finding container e2e8d078030f41829391bc2f1f80adebddbfb30faad9d5fc80033021e329e1c2: Status 404 returned error can't find the container with id e2e8d078030f41829391bc2f1f80adebddbfb30faad9d5fc80033021e329e1c2 Dec 03 08:55:48 crc kubenswrapper[4831]: I1203 08:55:48.417939 4831 generic.go:334] "Generic (PLEG): container finished" podID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerID="a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860" exitCode=0 Dec 03 08:55:48 crc kubenswrapper[4831]: I1203 08:55:48.418242 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kshn6" 
event={"ID":"a2514854-4180-4ac5-8c48-6953ec3b32bc","Type":"ContainerDied","Data":"a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860"} Dec 03 08:55:48 crc kubenswrapper[4831]: I1203 08:55:48.423686 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk2sg" event={"ID":"80e2ca55-52a9-4317-ac36-1592f5df5424","Type":"ContainerStarted","Data":"5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec"} Dec 03 08:55:48 crc kubenswrapper[4831]: I1203 08:55:48.423720 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk2sg" event={"ID":"80e2ca55-52a9-4317-ac36-1592f5df5424","Type":"ContainerStarted","Data":"e2e8d078030f41829391bc2f1f80adebddbfb30faad9d5fc80033021e329e1c2"} Dec 03 08:55:49 crc kubenswrapper[4831]: I1203 08:55:49.435664 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kshn6" event={"ID":"a2514854-4180-4ac5-8c48-6953ec3b32bc","Type":"ContainerStarted","Data":"5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2"} Dec 03 08:55:49 crc kubenswrapper[4831]: I1203 08:55:49.439332 4831 generic.go:334] "Generic (PLEG): container finished" podID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerID="5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec" exitCode=0 Dec 03 08:55:49 crc kubenswrapper[4831]: I1203 08:55:49.439363 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk2sg" event={"ID":"80e2ca55-52a9-4317-ac36-1592f5df5424","Type":"ContainerDied","Data":"5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec"} Dec 03 08:55:49 crc kubenswrapper[4831]: I1203 08:55:49.439378 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk2sg" 
event={"ID":"80e2ca55-52a9-4317-ac36-1592f5df5424","Type":"ContainerStarted","Data":"4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0"} Dec 03 08:55:49 crc kubenswrapper[4831]: I1203 08:55:49.465033 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kshn6" podStartSLOduration=2.855861254 podStartE2EDuration="6.465014726s" podCreationTimestamp="2025-12-03 08:55:43 +0000 UTC" firstStartedPulling="2025-12-03 08:55:45.367579436 +0000 UTC m=+8682.711162944" lastFinishedPulling="2025-12-03 08:55:48.976732908 +0000 UTC m=+8686.320316416" observedRunningTime="2025-12-03 08:55:49.461697103 +0000 UTC m=+8686.805280611" watchObservedRunningTime="2025-12-03 08:55:49.465014726 +0000 UTC m=+8686.808598234" Dec 03 08:55:49 crc kubenswrapper[4831]: I1203 08:55:49.595741 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jcxbq"] Dec 03 08:55:49 crc kubenswrapper[4831]: I1203 08:55:49.596075 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-jcxbq" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerName="registry-server" containerID="cri-o://a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1" gracePeriod=2 Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.325583 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.451459 4831 generic.go:334] "Generic (PLEG): container finished" podID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerID="4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0" exitCode=0 Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.451657 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk2sg" event={"ID":"80e2ca55-52a9-4317-ac36-1592f5df5424","Type":"ContainerDied","Data":"4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0"} Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.454962 4831 generic.go:334] "Generic (PLEG): container finished" podID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerID="a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1" exitCode=0 Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.455002 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcxbq" event={"ID":"41a65f44-18a8-4450-b7d6-56d0ec01ef63","Type":"ContainerDied","Data":"a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1"} Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.455026 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jcxbq" event={"ID":"41a65f44-18a8-4450-b7d6-56d0ec01ef63","Type":"ContainerDied","Data":"b16d7bd1201091667b227df368a8f988072fa10ed7d08a1b1adf32762aefa089"} Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.455043 4831 scope.go:117] "RemoveContainer" containerID="a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.455156 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jcxbq" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.464123 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-catalog-content\") pod \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.464210 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhhqm\" (UniqueName: \"kubernetes.io/projected/41a65f44-18a8-4450-b7d6-56d0ec01ef63-kube-api-access-zhhqm\") pod \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.464361 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-utilities\") pod \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\" (UID: \"41a65f44-18a8-4450-b7d6-56d0ec01ef63\") " Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.465455 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-utilities" (OuterVolumeSpecName: "utilities") pod "41a65f44-18a8-4450-b7d6-56d0ec01ef63" (UID: "41a65f44-18a8-4450-b7d6-56d0ec01ef63"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.483279 4831 scope.go:117] "RemoveContainer" containerID="7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.491611 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41a65f44-18a8-4450-b7d6-56d0ec01ef63-kube-api-access-zhhqm" (OuterVolumeSpecName: "kube-api-access-zhhqm") pod "41a65f44-18a8-4450-b7d6-56d0ec01ef63" (UID: "41a65f44-18a8-4450-b7d6-56d0ec01ef63"). InnerVolumeSpecName "kube-api-access-zhhqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.511020 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "41a65f44-18a8-4450-b7d6-56d0ec01ef63" (UID: "41a65f44-18a8-4450-b7d6-56d0ec01ef63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.547404 4831 scope.go:117] "RemoveContainer" containerID="128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.567854 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhhqm\" (UniqueName: \"kubernetes.io/projected/41a65f44-18a8-4450-b7d6-56d0ec01ef63-kube-api-access-zhhqm\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.567889 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.567899 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/41a65f44-18a8-4450-b7d6-56d0ec01ef63-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.583961 4831 scope.go:117] "RemoveContainer" containerID="a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1" Dec 03 08:55:50 crc kubenswrapper[4831]: E1203 08:55:50.585474 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1\": container with ID starting with a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1 not found: ID does not exist" containerID="a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.585508 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1"} err="failed to get container status 
\"a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1\": rpc error: code = NotFound desc = could not find container \"a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1\": container with ID starting with a4b402e3a4fe861a708bbf63f0eaf7566337d8f13a4220d851d0475f16cce2f1 not found: ID does not exist" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.585528 4831 scope.go:117] "RemoveContainer" containerID="7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d" Dec 03 08:55:50 crc kubenswrapper[4831]: E1203 08:55:50.585870 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d\": container with ID starting with 7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d not found: ID does not exist" containerID="7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.585973 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d"} err="failed to get container status \"7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d\": rpc error: code = NotFound desc = could not find container \"7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d\": container with ID starting with 7e89c7ae1d6dfff72e4f91aa6a154378ff4ee68dc8ef176ae0edae7414f12d9d not found: ID does not exist" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.586055 4831 scope.go:117] "RemoveContainer" containerID="128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e" Dec 03 08:55:50 crc kubenswrapper[4831]: E1203 08:55:50.586444 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e\": container with ID starting with 128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e not found: ID does not exist" containerID="128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.586475 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e"} err="failed to get container status \"128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e\": rpc error: code = NotFound desc = could not find container \"128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e\": container with ID starting with 128e5bcf5730d070272485cd6b4b2fe322d2e634965463dd0b82e2416a3b7a5e not found: ID does not exist" Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.793597 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-jcxbq"] Dec 03 08:55:50 crc kubenswrapper[4831]: I1203 08:55:50.803149 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-jcxbq"] Dec 03 08:55:51 crc kubenswrapper[4831]: I1203 08:55:51.037161 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" path="/var/lib/kubelet/pods/41a65f44-18a8-4450-b7d6-56d0ec01ef63/volumes" Dec 03 08:55:53 crc kubenswrapper[4831]: I1203 08:55:53.511796 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk2sg" event={"ID":"80e2ca55-52a9-4317-ac36-1592f5df5424","Type":"ContainerStarted","Data":"78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47"} Dec 03 08:55:53 crc kubenswrapper[4831]: I1203 08:55:53.552563 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sk2sg" 
podStartSLOduration=3.320088692 podStartE2EDuration="7.552543898s" podCreationTimestamp="2025-12-03 08:55:46 +0000 UTC" firstStartedPulling="2025-12-03 08:55:48.425264201 +0000 UTC m=+8685.768847709" lastFinishedPulling="2025-12-03 08:55:52.657719407 +0000 UTC m=+8690.001302915" observedRunningTime="2025-12-03 08:55:53.543153205 +0000 UTC m=+8690.886736723" watchObservedRunningTime="2025-12-03 08:55:53.552543898 +0000 UTC m=+8690.896127406" Dec 03 08:55:54 crc kubenswrapper[4831]: I1203 08:55:54.013466 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:55:54 crc kubenswrapper[4831]: E1203 08:55:54.013751 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:55:54 crc kubenswrapper[4831]: I1203 08:55:54.131531 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:54 crc kubenswrapper[4831]: I1203 08:55:54.131593 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:54 crc kubenswrapper[4831]: I1203 08:55:54.188560 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:54 crc kubenswrapper[4831]: I1203 08:55:54.587178 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:55 crc kubenswrapper[4831]: I1203 08:55:55.596280 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-kshn6"] Dec 03 08:55:56 crc kubenswrapper[4831]: I1203 08:55:56.547846 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-kshn6" podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerName="registry-server" containerID="cri-o://5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2" gracePeriod=2 Dec 03 08:55:56 crc kubenswrapper[4831]: I1203 08:55:56.978145 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.113664 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-utilities\") pod \"a2514854-4180-4ac5-8c48-6953ec3b32bc\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.113807 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-catalog-content\") pod \"a2514854-4180-4ac5-8c48-6953ec3b32bc\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.113884 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7kqg\" (UniqueName: \"kubernetes.io/projected/a2514854-4180-4ac5-8c48-6953ec3b32bc-kube-api-access-b7kqg\") pod \"a2514854-4180-4ac5-8c48-6953ec3b32bc\" (UID: \"a2514854-4180-4ac5-8c48-6953ec3b32bc\") " Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.114979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-utilities" (OuterVolumeSpecName: "utilities") pod "a2514854-4180-4ac5-8c48-6953ec3b32bc" (UID: 
"a2514854-4180-4ac5-8c48-6953ec3b32bc"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.123827 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2514854-4180-4ac5-8c48-6953ec3b32bc-kube-api-access-b7kqg" (OuterVolumeSpecName: "kube-api-access-b7kqg") pod "a2514854-4180-4ac5-8c48-6953ec3b32bc" (UID: "a2514854-4180-4ac5-8c48-6953ec3b32bc"). InnerVolumeSpecName "kube-api-access-b7kqg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.163368 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2514854-4180-4ac5-8c48-6953ec3b32bc" (UID: "a2514854-4180-4ac5-8c48-6953ec3b32bc"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.216435 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.216664 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7kqg\" (UniqueName: \"kubernetes.io/projected/a2514854-4180-4ac5-8c48-6953ec3b32bc-kube-api-access-b7kqg\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.216747 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2514854-4180-4ac5-8c48-6953ec3b32bc-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.332826 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.332886 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.396859 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.561459 4831 generic.go:334] "Generic (PLEG): container finished" podID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerID="5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2" exitCode=0 Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.561531 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kshn6" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.561580 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kshn6" event={"ID":"a2514854-4180-4ac5-8c48-6953ec3b32bc","Type":"ContainerDied","Data":"5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2"} Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.561608 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kshn6" event={"ID":"a2514854-4180-4ac5-8c48-6953ec3b32bc","Type":"ContainerDied","Data":"d02da32e25895525490818c7e7305a26cec1a3082b17d585ba0ac59a5eafdf69"} Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.561627 4831 scope.go:117] "RemoveContainer" containerID="5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.592622 4831 scope.go:117] "RemoveContainer" containerID="a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.605055 4831 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["openshift-marketplace/certified-operators-kshn6"] Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.615985 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-kshn6"] Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.621702 4831 scope.go:117] "RemoveContainer" containerID="bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.668706 4831 scope.go:117] "RemoveContainer" containerID="5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2" Dec 03 08:55:57 crc kubenswrapper[4831]: E1203 08:55:57.669391 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2\": container with ID starting with 5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2 not found: ID does not exist" containerID="5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.669426 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2"} err="failed to get container status \"5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2\": rpc error: code = NotFound desc = could not find container \"5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2\": container with ID starting with 5a0c951bf198eaf6bb623885921038b2ec24502a10b16c1ee29e9ca93c258fe2 not found: ID does not exist" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.669446 4831 scope.go:117] "RemoveContainer" containerID="a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860" Dec 03 08:55:57 crc kubenswrapper[4831]: E1203 08:55:57.669816 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860\": container with ID starting with a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860 not found: ID does not exist" containerID="a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.669841 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860"} err="failed to get container status \"a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860\": rpc error: code = NotFound desc = could not find container \"a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860\": container with ID starting with a492257d79a4629b80df5eebbe596745e3010edb67d3d797e8267080266c0860 not found: ID does not exist" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.669854 4831 scope.go:117] "RemoveContainer" containerID="bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856" Dec 03 08:55:57 crc kubenswrapper[4831]: E1203 08:55:57.670103 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856\": container with ID starting with bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856 not found: ID does not exist" containerID="bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856" Dec 03 08:55:57 crc kubenswrapper[4831]: I1203 08:55:57.670125 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856"} err="failed to get container status \"bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856\": rpc error: code = NotFound desc = could not find container 
\"bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856\": container with ID starting with bb3f6912d3c4bb36f5743185a5b6dc23448de1c63012c03dda7f53e695d3b856 not found: ID does not exist" Dec 03 08:55:59 crc kubenswrapper[4831]: I1203 08:55:59.042249 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" path="/var/lib/kubelet/pods/a2514854-4180-4ac5-8c48-6953ec3b32bc/volumes" Dec 03 08:56:07 crc kubenswrapper[4831]: I1203 08:56:07.387557 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:56:08 crc kubenswrapper[4831]: I1203 08:56:08.826353 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk2sg"] Dec 03 08:56:08 crc kubenswrapper[4831]: I1203 08:56:08.826824 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sk2sg" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerName="registry-server" containerID="cri-o://78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47" gracePeriod=2 Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.013227 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:56:09 crc kubenswrapper[4831]: E1203 08:56:09.013599 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.455340 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.534786 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-catalog-content\") pod \"80e2ca55-52a9-4317-ac36-1592f5df5424\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.534955 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmmz5\" (UniqueName: \"kubernetes.io/projected/80e2ca55-52a9-4317-ac36-1592f5df5424-kube-api-access-xmmz5\") pod \"80e2ca55-52a9-4317-ac36-1592f5df5424\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.534984 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-utilities\") pod \"80e2ca55-52a9-4317-ac36-1592f5df5424\" (UID: \"80e2ca55-52a9-4317-ac36-1592f5df5424\") " Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.536015 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-utilities" (OuterVolumeSpecName: "utilities") pod "80e2ca55-52a9-4317-ac36-1592f5df5424" (UID: "80e2ca55-52a9-4317-ac36-1592f5df5424"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.541437 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80e2ca55-52a9-4317-ac36-1592f5df5424-kube-api-access-xmmz5" (OuterVolumeSpecName: "kube-api-access-xmmz5") pod "80e2ca55-52a9-4317-ac36-1592f5df5424" (UID: "80e2ca55-52a9-4317-ac36-1592f5df5424"). InnerVolumeSpecName "kube-api-access-xmmz5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.557259 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "80e2ca55-52a9-4317-ac36-1592f5df5424" (UID: "80e2ca55-52a9-4317-ac36-1592f5df5424"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.637243 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmmz5\" (UniqueName: \"kubernetes.io/projected/80e2ca55-52a9-4317-ac36-1592f5df5424-kube-api-access-xmmz5\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.637287 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.637302 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/80e2ca55-52a9-4317-ac36-1592f5df5424-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.751730 4831 generic.go:334] "Generic (PLEG): container finished" podID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerID="78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47" exitCode=0 Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.751778 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sk2sg" event={"ID":"80e2ca55-52a9-4317-ac36-1592f5df5424","Type":"ContainerDied","Data":"78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47"} Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.751821 4831 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-sk2sg" event={"ID":"80e2ca55-52a9-4317-ac36-1592f5df5424","Type":"ContainerDied","Data":"e2e8d078030f41829391bc2f1f80adebddbfb30faad9d5fc80033021e329e1c2"} Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.751843 4831 scope.go:117] "RemoveContainer" containerID="78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.751853 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sk2sg" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.791784 4831 scope.go:117] "RemoveContainer" containerID="4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.808240 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk2sg"] Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.832197 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sk2sg"] Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.845972 4831 scope.go:117] "RemoveContainer" containerID="5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.915300 4831 scope.go:117] "RemoveContainer" containerID="78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47" Dec 03 08:56:09 crc kubenswrapper[4831]: E1203 08:56:09.915971 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47\": container with ID starting with 78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47 not found: ID does not exist" containerID="78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.916024 4831 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47"} err="failed to get container status \"78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47\": rpc error: code = NotFound desc = could not find container \"78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47\": container with ID starting with 78aa1c9523adc46998dab292f18c6f4665955436ca1cb63e5eb78ab266a22c47 not found: ID does not exist" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.916061 4831 scope.go:117] "RemoveContainer" containerID="4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0" Dec 03 08:56:09 crc kubenswrapper[4831]: E1203 08:56:09.916651 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0\": container with ID starting with 4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0 not found: ID does not exist" containerID="4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.916684 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0"} err="failed to get container status \"4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0\": rpc error: code = NotFound desc = could not find container \"4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0\": container with ID starting with 4954883713b950533b8a4657e3f926f62c60acea3abc2517af4aae8d314911a0 not found: ID does not exist" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.916703 4831 scope.go:117] "RemoveContainer" containerID="5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec" Dec 03 08:56:09 crc kubenswrapper[4831]: E1203 
08:56:09.917191 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec\": container with ID starting with 5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec not found: ID does not exist" containerID="5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec" Dec 03 08:56:09 crc kubenswrapper[4831]: I1203 08:56:09.917224 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec"} err="failed to get container status \"5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec\": rpc error: code = NotFound desc = could not find container \"5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec\": container with ID starting with 5e80eb48a941a3cf16f187ca55aca7bd386d269b208915d07698097c946061ec not found: ID does not exist" Dec 03 08:56:11 crc kubenswrapper[4831]: I1203 08:56:11.033688 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" path="/var/lib/kubelet/pods/80e2ca55-52a9-4317-ac36-1592f5df5424/volumes" Dec 03 08:56:21 crc kubenswrapper[4831]: I1203 08:56:21.013936 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:56:21 crc kubenswrapper[4831]: E1203 08:56:21.015160 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:56:32 crc kubenswrapper[4831]: I1203 08:56:32.013470 
4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:56:32 crc kubenswrapper[4831]: E1203 08:56:32.014607 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:56:32 crc kubenswrapper[4831]: I1203 08:56:32.057477 4831 generic.go:334] "Generic (PLEG): container finished" podID="b0f24e55-9970-4294-8bca-b289f2958f85" containerID="17fc0492fe402b55626ebccf401b9966d8df236441343c220c9a6518dc7fe191" exitCode=0 Dec 03 08:56:32 crc kubenswrapper[4831]: I1203 08:56:32.057539 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" event={"ID":"b0f24e55-9970-4294-8bca-b289f2958f85","Type":"ContainerDied","Data":"17fc0492fe402b55626ebccf401b9966d8df236441343c220c9a6518dc7fe191"} Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.673823 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.830294 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-combined-ca-bundle\") pod \"b0f24e55-9970-4294-8bca-b289f2958f85\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.830442 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ceph\") pod \"b0f24e55-9970-4294-8bca-b289f2958f85\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.830519 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ssh-key\") pod \"b0f24e55-9970-4294-8bca-b289f2958f85\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.830547 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-agent-neutron-config-0\") pod \"b0f24e55-9970-4294-8bca-b289f2958f85\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.830579 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rsxr\" (UniqueName: \"kubernetes.io/projected/b0f24e55-9970-4294-8bca-b289f2958f85-kube-api-access-7rsxr\") pod \"b0f24e55-9970-4294-8bca-b289f2958f85\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.830644 4831 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-inventory\") pod \"b0f24e55-9970-4294-8bca-b289f2958f85\" (UID: \"b0f24e55-9970-4294-8bca-b289f2958f85\") " Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.836382 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0f24e55-9970-4294-8bca-b289f2958f85-kube-api-access-7rsxr" (OuterVolumeSpecName: "kube-api-access-7rsxr") pod "b0f24e55-9970-4294-8bca-b289f2958f85" (UID: "b0f24e55-9970-4294-8bca-b289f2958f85"). InnerVolumeSpecName "kube-api-access-7rsxr". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.837053 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-combined-ca-bundle" (OuterVolumeSpecName: "neutron-sriov-combined-ca-bundle") pod "b0f24e55-9970-4294-8bca-b289f2958f85" (UID: "b0f24e55-9970-4294-8bca-b289f2958f85"). InnerVolumeSpecName "neutron-sriov-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.838437 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ceph" (OuterVolumeSpecName: "ceph") pod "b0f24e55-9970-4294-8bca-b289f2958f85" (UID: "b0f24e55-9970-4294-8bca-b289f2958f85"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.866445 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-inventory" (OuterVolumeSpecName: "inventory") pod "b0f24e55-9970-4294-8bca-b289f2958f85" (UID: "b0f24e55-9970-4294-8bca-b289f2958f85"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.876702 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "b0f24e55-9970-4294-8bca-b289f2958f85" (UID: "b0f24e55-9970-4294-8bca-b289f2958f85"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.883763 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-sriov-agent-neutron-config-0") pod "b0f24e55-9970-4294-8bca-b289f2958f85" (UID: "b0f24e55-9970-4294-8bca-b289f2958f85"). InnerVolumeSpecName "neutron-sriov-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.934593 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.934656 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.934672 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.934687 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-sriov-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-neutron-sriov-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.934700 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rsxr\" (UniqueName: \"kubernetes.io/projected/b0f24e55-9970-4294-8bca-b289f2958f85-kube-api-access-7rsxr\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:33 crc kubenswrapper[4831]: I1203 08:56:33.934714 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b0f24e55-9970-4294-8bca-b289f2958f85-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.087103 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" event={"ID":"b0f24e55-9970-4294-8bca-b289f2958f85","Type":"ContainerDied","Data":"5a4f020b7eb7d6de1128d4bc82a892e8f95dad979f61f79f04bc39f46f79066b"} Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.087160 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a4f020b7eb7d6de1128d4bc82a892e8f95dad979f61f79f04bc39f46f79066b" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.087644 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-sriov-openstack-openstack-cell1-7hxpj" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.192730 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-c65dr"] Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193184 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerName="extract-content" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193205 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerName="extract-content" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193244 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193253 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193268 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerName="extract-utilities" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193276 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerName="extract-utilities" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193285 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193292 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193337 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerName="extract-content" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193347 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerName="extract-content" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193361 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0f24e55-9970-4294-8bca-b289f2958f85" containerName="neutron-sriov-openstack-openstack-cell1" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193369 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0f24e55-9970-4294-8bca-b289f2958f85" containerName="neutron-sriov-openstack-openstack-cell1" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193385 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerName="extract-utilities" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193393 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerName="extract-utilities" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193430 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerName="extract-content" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193439 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerName="extract-content" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193450 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193457 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: E1203 08:56:34.193473 4831 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerName="extract-utilities" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193480 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerName="extract-utilities" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193720 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2514854-4180-4ac5-8c48-6953ec3b32bc" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193745 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="41a65f44-18a8-4450-b7d6-56d0ec01ef63" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193754 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="80e2ca55-52a9-4317-ac36-1592f5df5424" containerName="registry-server" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.193765 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0f24e55-9970-4294-8bca-b289f2958f85" containerName="neutron-sriov-openstack-openstack-cell1" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.194556 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.199081 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.199259 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.199436 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.199635 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.200655 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-dhcp-agent-neutron-config" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.221222 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-c65dr"] Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.241541 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.241666 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.241705 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.241756 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.241818 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.241882 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb2vq\" (UniqueName: \"kubernetes.io/projected/44f4ecb4-df16-4534-804d-2df7d53861cb-kube-api-access-zb2vq\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.343487 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-agent-neutron-config-0\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.343972 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.344055 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.344163 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.344263 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb2vq\" (UniqueName: \"kubernetes.io/projected/44f4ecb4-df16-4534-804d-2df7d53861cb-kube-api-access-zb2vq\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " 
pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.344335 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.350471 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-inventory\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.351115 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ceph\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.352372 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-combined-ca-bundle\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.352386 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-agent-neutron-config-0\") pod 
\"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.357773 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ssh-key\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.368663 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb2vq\" (UniqueName: \"kubernetes.io/projected/44f4ecb4-df16-4534-804d-2df7d53861cb-kube-api-access-zb2vq\") pod \"neutron-dhcp-openstack-openstack-cell1-c65dr\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:34 crc kubenswrapper[4831]: I1203 08:56:34.540283 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:56:35 crc kubenswrapper[4831]: I1203 08:56:35.188475 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-dhcp-openstack-openstack-cell1-c65dr"] Dec 03 08:56:35 crc kubenswrapper[4831]: W1203 08:56:35.199296 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44f4ecb4_df16_4534_804d_2df7d53861cb.slice/crio-36c3471b26c53db153969b8e628d2f80f7880258cdb1b8dfd641e00e513aca98 WatchSource:0}: Error finding container 36c3471b26c53db153969b8e628d2f80f7880258cdb1b8dfd641e00e513aca98: Status 404 returned error can't find the container with id 36c3471b26c53db153969b8e628d2f80f7880258cdb1b8dfd641e00e513aca98 Dec 03 08:56:36 crc kubenswrapper[4831]: I1203 08:56:36.113702 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" event={"ID":"44f4ecb4-df16-4534-804d-2df7d53861cb","Type":"ContainerStarted","Data":"29d6d3cf35578a0f5fc7b95b566441044ff37f4f712134fe91dfe62456e36e41"} Dec 03 08:56:36 crc kubenswrapper[4831]: I1203 08:56:36.114276 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" event={"ID":"44f4ecb4-df16-4534-804d-2df7d53861cb","Type":"ContainerStarted","Data":"36c3471b26c53db153969b8e628d2f80f7880258cdb1b8dfd641e00e513aca98"} Dec 03 08:56:36 crc kubenswrapper[4831]: I1203 08:56:36.148427 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" podStartSLOduration=1.9361744810000001 podStartE2EDuration="2.148399541s" podCreationTimestamp="2025-12-03 08:56:34 +0000 UTC" firstStartedPulling="2025-12-03 08:56:35.202652434 +0000 UTC m=+8732.546235952" lastFinishedPulling="2025-12-03 08:56:35.414877494 +0000 UTC m=+8732.758461012" observedRunningTime="2025-12-03 
08:56:36.137842052 +0000 UTC m=+8733.481425650" watchObservedRunningTime="2025-12-03 08:56:36.148399541 +0000 UTC m=+8733.491983089" Dec 03 08:56:45 crc kubenswrapper[4831]: I1203 08:56:45.014724 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:56:45 crc kubenswrapper[4831]: E1203 08:56:45.015965 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:56:59 crc kubenswrapper[4831]: I1203 08:56:59.014144 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:56:59 crc kubenswrapper[4831]: E1203 08:56:59.015120 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:57:11 crc kubenswrapper[4831]: I1203 08:57:11.013788 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:57:11 crc kubenswrapper[4831]: E1203 08:57:11.014476 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:57:22 crc kubenswrapper[4831]: I1203 08:57:22.012829 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:57:22 crc kubenswrapper[4831]: E1203 08:57:22.013598 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 08:57:33 crc kubenswrapper[4831]: I1203 08:57:33.025654 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 08:57:33 crc kubenswrapper[4831]: I1203 08:57:33.810412 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"0cb79465c4851d9ccab0a172ec47f2783944898da9d4aedbce6d8b6c849dbeb6"} Dec 03 08:58:20 crc kubenswrapper[4831]: I1203 08:58:20.465955 4831 generic.go:334] "Generic (PLEG): container finished" podID="44f4ecb4-df16-4534-804d-2df7d53861cb" containerID="29d6d3cf35578a0f5fc7b95b566441044ff37f4f712134fe91dfe62456e36e41" exitCode=0 Dec 03 08:58:20 crc kubenswrapper[4831]: I1203 08:58:20.466354 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" event={"ID":"44f4ecb4-df16-4534-804d-2df7d53861cb","Type":"ContainerDied","Data":"29d6d3cf35578a0f5fc7b95b566441044ff37f4f712134fe91dfe62456e36e41"} Dec 
03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.030200 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.084714 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-agent-neutron-config-0\") pod \"44f4ecb4-df16-4534-804d-2df7d53861cb\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.084902 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-inventory\") pod \"44f4ecb4-df16-4534-804d-2df7d53861cb\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.085040 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ssh-key\") pod \"44f4ecb4-df16-4534-804d-2df7d53861cb\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.085407 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zb2vq\" (UniqueName: \"kubernetes.io/projected/44f4ecb4-df16-4534-804d-2df7d53861cb-kube-api-access-zb2vq\") pod \"44f4ecb4-df16-4534-804d-2df7d53861cb\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.085657 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ceph\") pod \"44f4ecb4-df16-4534-804d-2df7d53861cb\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " Dec 03 08:58:22 crc 
kubenswrapper[4831]: I1203 08:58:22.085721 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-combined-ca-bundle\") pod \"44f4ecb4-df16-4534-804d-2df7d53861cb\" (UID: \"44f4ecb4-df16-4534-804d-2df7d53861cb\") " Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.098726 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44f4ecb4-df16-4534-804d-2df7d53861cb-kube-api-access-zb2vq" (OuterVolumeSpecName: "kube-api-access-zb2vq") pod "44f4ecb4-df16-4534-804d-2df7d53861cb" (UID: "44f4ecb4-df16-4534-804d-2df7d53861cb"). InnerVolumeSpecName "kube-api-access-zb2vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.099108 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ceph" (OuterVolumeSpecName: "ceph") pod "44f4ecb4-df16-4534-804d-2df7d53861cb" (UID: "44f4ecb4-df16-4534-804d-2df7d53861cb"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.100669 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-combined-ca-bundle" (OuterVolumeSpecName: "neutron-dhcp-combined-ca-bundle") pod "44f4ecb4-df16-4534-804d-2df7d53861cb" (UID: "44f4ecb4-df16-4534-804d-2df7d53861cb"). InnerVolumeSpecName "neutron-dhcp-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.133450 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "44f4ecb4-df16-4534-804d-2df7d53861cb" (UID: "44f4ecb4-df16-4534-804d-2df7d53861cb"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.134906 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-dhcp-agent-neutron-config-0") pod "44f4ecb4-df16-4534-804d-2df7d53861cb" (UID: "44f4ecb4-df16-4534-804d-2df7d53861cb"). InnerVolumeSpecName "neutron-dhcp-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.166051 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-inventory" (OuterVolumeSpecName: "inventory") pod "44f4ecb4-df16-4534-804d-2df7d53861cb" (UID: "44f4ecb4-df16-4534-804d-2df7d53861cb"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.188539 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zb2vq\" (UniqueName: \"kubernetes.io/projected/44f4ecb4-df16-4534-804d-2df7d53861cb-kube-api-access-zb2vq\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.188580 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.188592 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.188605 4831 reconciler_common.go:293] "Volume detached for volume \"neutron-dhcp-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-neutron-dhcp-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.188617 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.188626 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/44f4ecb4-df16-4534-804d-2df7d53861cb-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.490946 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" 
event={"ID":"44f4ecb4-df16-4534-804d-2df7d53861cb","Type":"ContainerDied","Data":"36c3471b26c53db153969b8e628d2f80f7880258cdb1b8dfd641e00e513aca98"} Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.490992 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="36c3471b26c53db153969b8e628d2f80f7880258cdb1b8dfd641e00e513aca98" Dec 03 08:58:22 crc kubenswrapper[4831]: I1203 08:58:22.491001 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-dhcp-openstack-openstack-cell1-c65dr" Dec 03 08:58:43 crc kubenswrapper[4831]: I1203 08:58:43.779233 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:58:43 crc kubenswrapper[4831]: I1203 08:58:43.779993 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell0-conductor-0" podUID="5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" containerName="nova-cell0-conductor-conductor" containerID="cri-o://6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120" gracePeriod=30 Dec 03 08:58:43 crc kubenswrapper[4831]: I1203 08:58:43.867289 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:58:43 crc kubenswrapper[4831]: I1203 08:58:43.867536 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-conductor-0" podUID="f75ecfa5-4e93-432b-907b-e79a5de81fc9" containerName="nova-cell1-conductor-conductor" containerID="cri-o://82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5" gracePeriod=30 Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.487516 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk"] Dec 03 08:58:44 crc kubenswrapper[4831]: E1203 08:58:44.488190 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="44f4ecb4-df16-4534-804d-2df7d53861cb" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.488218 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="44f4ecb4-df16-4534-804d-2df7d53861cb" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.488602 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="44f4ecb4-df16-4534-804d-2df7d53861cb" containerName="neutron-dhcp-openstack-openstack-cell1" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.489554 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.493828 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.494294 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-cells-global-config" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.494372 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-cell1" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.494485 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-adoption-secret" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.494614 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.494749 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-dockercfg-n74w5" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.497589 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Dec 03 08:58:44 crc 
kubenswrapper[4831]: I1203 08:58:44.511758 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.511830 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.511899 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.511934 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.511974 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.512038 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.512097 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.512159 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzzzg\" (UniqueName: \"kubernetes.io/projected/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-kube-api-access-lzzzg\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.512224 4831 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.512262 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.512300 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.558810 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk"] Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.613980 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzzzg\" (UniqueName: \"kubernetes.io/projected/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-kube-api-access-lzzzg\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614058 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614113 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614157 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614198 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614226 4831 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614279 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614330 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614371 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ssh-key\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614437 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.614500 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.615391 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.615850 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.623652 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ssh-key\") pod 
\"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.623895 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.624359 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ceph\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.624662 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-1\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.624776 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " 
pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.625188 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-0\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.625306 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-combined-ca-bundle\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.634617 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-inventory\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.636042 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzzzg\" (UniqueName: \"kubernetes.io/projected/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-kube-api-access-lzzzg\") pod \"nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.758832 
4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.760594 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-api" containerID="cri-o://4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598" gracePeriod=30 Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.760967 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-log" containerID="cri-o://1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb" gracePeriod=30 Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.770685 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.770910 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="4856725f-47d3-4087-873b-89e7d53b6b0d" containerName="nova-scheduler-scheduler" containerID="cri-o://b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383" gracePeriod=30 Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.795869 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.796168 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-log" containerID="cri-o://3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283" gracePeriod=30 Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.796252 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" 
containerName="nova-metadata-metadata" containerID="cri-o://8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794" gracePeriod=30 Dec 03 08:58:44 crc kubenswrapper[4831]: I1203 08:58:44.813279 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 08:58:45 crc kubenswrapper[4831]: I1203 08:58:45.395207 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk"] Dec 03 08:58:45 crc kubenswrapper[4831]: I1203 08:58:45.723858 4831 generic.go:334] "Generic (PLEG): container finished" podID="e306eb84-97a5-44b8-9675-018da9b131a2" containerID="1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb" exitCode=143 Dec 03 08:58:45 crc kubenswrapper[4831]: I1203 08:58:45.723960 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e306eb84-97a5-44b8-9675-018da9b131a2","Type":"ContainerDied","Data":"1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb"} Dec 03 08:58:45 crc kubenswrapper[4831]: I1203 08:58:45.725425 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" event={"ID":"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d","Type":"ContainerStarted","Data":"049fa21db10a0bd1e83834f1ffe2faa5e824647e9a2e18131ab4a841ab548247"} Dec 03 08:58:45 crc kubenswrapper[4831]: I1203 08:58:45.727769 4831 generic.go:334] "Generic (PLEG): container finished" podID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerID="3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283" exitCode=143 Dec 03 08:58:45 crc kubenswrapper[4831]: I1203 08:58:45.727834 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"22841d09-0c8c-49b0-a674-8bc092431ad9","Type":"ContainerDied","Data":"3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283"} Dec 03 08:58:45 crc kubenswrapper[4831]: E1203 08:58:45.867269 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5 is running failed: container process not found" containerID="82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:58:45 crc kubenswrapper[4831]: E1203 08:58:45.868822 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5 is running failed: container process not found" containerID="82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:58:45 crc kubenswrapper[4831]: E1203 08:58:45.869232 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5 is running failed: container process not found" containerID="82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5" cmd=["/usr/bin/pgrep","-r","DRST","nova-conductor"] Dec 03 08:58:45 crc kubenswrapper[4831]: E1203 08:58:45.869277 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-cell1-conductor-0" podUID="f75ecfa5-4e93-432b-907b-e79a5de81fc9" containerName="nova-cell1-conductor-conductor" Dec 03 
08:58:46 crc kubenswrapper[4831]: I1203 08:58:46.738544 4831 generic.go:334] "Generic (PLEG): container finished" podID="f75ecfa5-4e93-432b-907b-e79a5de81fc9" containerID="82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5" exitCode=0 Dec 03 08:58:46 crc kubenswrapper[4831]: I1203 08:58:46.738683 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f75ecfa5-4e93-432b-907b-e79a5de81fc9","Type":"ContainerDied","Data":"82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5"} Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.329544 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:47 crc kubenswrapper[4831]: E1203 08:58:47.497873 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383 is running failed: container process not found" containerID="b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 08:58:47 crc kubenswrapper[4831]: E1203 08:58:47.498599 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383 is running failed: container process not found" containerID="b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 08:58:47 crc kubenswrapper[4831]: E1203 08:58:47.498941 4831 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383 is running failed: container process not found" 
containerID="b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Dec 03 08:58:47 crc kubenswrapper[4831]: E1203 08:58:47.498981 4831 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383 is running failed: container process not found" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="4856725f-47d3-4087-873b-89e7d53b6b0d" containerName="nova-scheduler-scheduler" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.504014 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hwv4t\" (UniqueName: \"kubernetes.io/projected/f75ecfa5-4e93-432b-907b-e79a5de81fc9-kube-api-access-hwv4t\") pod \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.504168 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-config-data\") pod \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.504402 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-combined-ca-bundle\") pod \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\" (UID: \"f75ecfa5-4e93-432b-907b-e79a5de81fc9\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.510536 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f75ecfa5-4e93-432b-907b-e79a5de81fc9-kube-api-access-hwv4t" (OuterVolumeSpecName: "kube-api-access-hwv4t") pod "f75ecfa5-4e93-432b-907b-e79a5de81fc9" (UID: 
"f75ecfa5-4e93-432b-907b-e79a5de81fc9"). InnerVolumeSpecName "kube-api-access-hwv4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.515058 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.542406 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-config-data" (OuterVolumeSpecName: "config-data") pod "f75ecfa5-4e93-432b-907b-e79a5de81fc9" (UID: "f75ecfa5-4e93-432b-907b-e79a5de81fc9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.551748 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f75ecfa5-4e93-432b-907b-e79a5de81fc9" (UID: "f75ecfa5-4e93-432b-907b-e79a5de81fc9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.607268 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.607304 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hwv4t\" (UniqueName: \"kubernetes.io/projected/f75ecfa5-4e93-432b-907b-e79a5de81fc9-kube-api-access-hwv4t\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.607337 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f75ecfa5-4e93-432b-907b-e79a5de81fc9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.699470 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.708412 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-combined-ca-bundle\") pod \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.708583 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n6twn\" (UniqueName: \"kubernetes.io/projected/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-kube-api-access-n6twn\") pod \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.708627 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-config-data\") pod \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\" (UID: \"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.713764 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-kube-api-access-n6twn" (OuterVolumeSpecName: "kube-api-access-n6twn") pod "5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" (UID: "5e68d3fd-75f1-44e6-bbe6-3ce6a2657135"). InnerVolumeSpecName "kube-api-access-n6twn". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.750471 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" (UID: "5e68d3fd-75f1-44e6-bbe6-3ce6a2657135"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.761931 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-config-data" (OuterVolumeSpecName: "config-data") pod "5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" (UID: "5e68d3fd-75f1-44e6-bbe6-3ce6a2657135"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.778581 4831 generic.go:334] "Generic (PLEG): container finished" podID="4856725f-47d3-4087-873b-89e7d53b6b0d" containerID="b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383" exitCode=0 Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.778685 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4856725f-47d3-4087-873b-89e7d53b6b0d","Type":"ContainerDied","Data":"b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383"} Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.778720 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"4856725f-47d3-4087-873b-89e7d53b6b0d","Type":"ContainerDied","Data":"9874ccca751e2b9abb7ac6b908f83e0af901a62cc566bf340ec4cfbcd623f2a0"} Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.778745 4831 scope.go:117] "RemoveContainer" containerID="b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.778908 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.783789 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" event={"ID":"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d","Type":"ContainerStarted","Data":"f32f4d1a918837e95d03cfc583751e02a14eb9b704a01c4074b8cdabc544e124"} Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.787906 4831 generic.go:334] "Generic (PLEG): container finished" podID="5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" containerID="6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120" exitCode=0 Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.787984 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135","Type":"ContainerDied","Data":"6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120"} Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.788015 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"5e68d3fd-75f1-44e6-bbe6-3ce6a2657135","Type":"ContainerDied","Data":"6e4db7ee316c8f000d3c245b8096f7b54bd893bcd3f3ddfe39271f3b01d291e6"} Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.788078 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.793200 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"f75ecfa5-4e93-432b-907b-e79a5de81fc9","Type":"ContainerDied","Data":"9c6849908295908bc4900f5f55dc2239af4f0d39875298386f62d2f78e66e0fa"} Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.793270 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.811006 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-config-data\") pod \"4856725f-47d3-4087-873b-89e7d53b6b0d\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.811311 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-combined-ca-bundle\") pod \"4856725f-47d3-4087-873b-89e7d53b6b0d\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.811371 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rg4xz\" (UniqueName: \"kubernetes.io/projected/4856725f-47d3-4087-873b-89e7d53b6b0d-kube-api-access-rg4xz\") pod \"4856725f-47d3-4087-873b-89e7d53b6b0d\" (UID: \"4856725f-47d3-4087-873b-89e7d53b6b0d\") " Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.811785 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.811799 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n6twn\" (UniqueName: \"kubernetes.io/projected/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-kube-api-access-n6twn\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.811811 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 
08:58:47.813539 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" podStartSLOduration=2.444637137 podStartE2EDuration="3.813517452s" podCreationTimestamp="2025-12-03 08:58:44 +0000 UTC" firstStartedPulling="2025-12-03 08:58:45.400996052 +0000 UTC m=+8862.744579560" lastFinishedPulling="2025-12-03 08:58:46.769876367 +0000 UTC m=+8864.113459875" observedRunningTime="2025-12-03 08:58:47.809311201 +0000 UTC m=+8865.152894729" watchObservedRunningTime="2025-12-03 08:58:47.813517452 +0000 UTC m=+8865.157100960" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.818818 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4856725f-47d3-4087-873b-89e7d53b6b0d-kube-api-access-rg4xz" (OuterVolumeSpecName: "kube-api-access-rg4xz") pod "4856725f-47d3-4087-873b-89e7d53b6b0d" (UID: "4856725f-47d3-4087-873b-89e7d53b6b0d"). InnerVolumeSpecName "kube-api-access-rg4xz". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.865306 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4856725f-47d3-4087-873b-89e7d53b6b0d" (UID: "4856725f-47d3-4087-873b-89e7d53b6b0d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.866628 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-config-data" (OuterVolumeSpecName: "config-data") pod "4856725f-47d3-4087-873b-89e7d53b6b0d" (UID: "4856725f-47d3-4087-873b-89e7d53b6b0d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.913966 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.914010 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rg4xz\" (UniqueName: \"kubernetes.io/projected/4856725f-47d3-4087-873b-89e7d53b6b0d-kube-api-access-rg4xz\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.914028 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4856725f-47d3-4087-873b-89e7d53b6b0d-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.942679 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.89:8775/\": read tcp 10.217.0.2:57072->10.217.1.89:8775: read: connection reset by peer" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.942703 4831 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.89:8775/\": read tcp 10.217.0.2:57074->10.217.1.89:8775: read: connection reset by peer" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.965640 4831 scope.go:117] "RemoveContainer" containerID="b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383" Dec 03 08:58:47 crc kubenswrapper[4831]: E1203 08:58:47.966128 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383\": container with ID starting with b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383 not found: ID does not exist" containerID="b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.966163 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383"} err="failed to get container status \"b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383\": rpc error: code = NotFound desc = could not find container \"b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383\": container with ID starting with b8f032094c4ec3c6cae833ccf7539525b5ea337e2844e71b4bf0efc85a6b6383 not found: ID does not exist" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.966183 4831 scope.go:117] "RemoveContainer" containerID="6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120" Dec 03 08:58:47 crc kubenswrapper[4831]: I1203 08:58:47.974614 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.003769 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.035965 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.066173 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.075770 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: E1203 08:58:48.076264 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" containerName="nova-cell0-conductor-conductor" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.076282 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" containerName="nova-cell0-conductor-conductor" Dec 03 08:58:48 crc kubenswrapper[4831]: E1203 08:58:48.076305 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f75ecfa5-4e93-432b-907b-e79a5de81fc9" containerName="nova-cell1-conductor-conductor" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.076312 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f75ecfa5-4e93-432b-907b-e79a5de81fc9" containerName="nova-cell1-conductor-conductor" Dec 03 08:58:48 crc kubenswrapper[4831]: E1203 08:58:48.076366 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4856725f-47d3-4087-873b-89e7d53b6b0d" containerName="nova-scheduler-scheduler" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.076371 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="4856725f-47d3-4087-873b-89e7d53b6b0d" containerName="nova-scheduler-scheduler" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.076571 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="f75ecfa5-4e93-432b-907b-e79a5de81fc9" containerName="nova-cell1-conductor-conductor" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.076593 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="4856725f-47d3-4087-873b-89e7d53b6b0d" containerName="nova-scheduler-scheduler" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.076613 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" containerName="nova-cell0-conductor-conductor" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.077413 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.083054 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.085224 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.086555 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.088540 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.095422 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.105957 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.136528 4831 scope.go:117] "RemoveContainer" containerID="6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120" Dec 03 08:58:48 crc kubenswrapper[4831]: E1203 08:58:48.138306 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120\": container with ID starting with 6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120 not found: ID does not exist" containerID="6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.138351 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120"} err="failed to get container status 
\"6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120\": rpc error: code = NotFound desc = could not find container \"6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120\": container with ID starting with 6d70e512d40e327aa2cfe8461f9a7f61bf8a9cd51f6fccbe2d8bb8e8db1c6120 not found: ID does not exist" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.138371 4831 scope.go:117] "RemoveContainer" containerID="82d8a75c1e5c6455407f6bdf9a8d421543346dd521c05398480bc5dd2bc67eb5" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.148185 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.158251 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.171649 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.173019 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.178734 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.182143 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.224479 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a89396d-a305-4cea-b054-d3ad772f79e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.224567 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4nw5\" (UniqueName: \"kubernetes.io/projected/2a89396d-a305-4cea-b054-d3ad772f79e0-kube-api-access-p4nw5\") pod \"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.224747 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.224785 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc 
kubenswrapper[4831]: I1203 08:58:48.224841 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4hdf\" (UniqueName: \"kubernetes.io/projected/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-kube-api-access-q4hdf\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.224878 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a89396d-a305-4cea-b054-d3ad772f79e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.327597 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fm85\" (UniqueName: \"kubernetes.io/projected/b2e131ef-51b4-4d7b-95f1-be753d22436a-kube-api-access-2fm85\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.327872 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.327945 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.328044 4831 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2e131ef-51b4-4d7b-95f1-be753d22436a-config-data\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.328110 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4hdf\" (UniqueName: \"kubernetes.io/projected/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-kube-api-access-q4hdf\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.328149 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e131ef-51b4-4d7b-95f1-be753d22436a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.328210 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a89396d-a305-4cea-b054-d3ad772f79e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.328240 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a89396d-a305-4cea-b054-d3ad772f79e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.328362 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-p4nw5\" (UniqueName: \"kubernetes.io/projected/2a89396d-a305-4cea-b054-d3ad772f79e0-kube-api-access-p4nw5\") pod \"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.335160 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a89396d-a305-4cea-b054-d3ad772f79e0-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.342793 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a89396d-a305-4cea-b054-d3ad772f79e0-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.345919 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.348527 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.350792 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4nw5\" (UniqueName: \"kubernetes.io/projected/2a89396d-a305-4cea-b054-d3ad772f79e0-kube-api-access-p4nw5\") pod 
\"nova-cell1-conductor-0\" (UID: \"2a89396d-a305-4cea-b054-d3ad772f79e0\") " pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.356500 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4hdf\" (UniqueName: \"kubernetes.io/projected/a2be4cea-396e-49b0-aef1-4c28ac8dcd78-kube-api-access-q4hdf\") pod \"nova-cell0-conductor-0\" (UID: \"a2be4cea-396e-49b0-aef1-4c28ac8dcd78\") " pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.417518 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.433138 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.433479 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2e131ef-51b4-4d7b-95f1-be753d22436a-config-data\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.433539 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e131ef-51b4-4d7b-95f1-be753d22436a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.433700 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fm85\" (UniqueName: \"kubernetes.io/projected/b2e131ef-51b4-4d7b-95f1-be753d22436a-kube-api-access-2fm85\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc 
kubenswrapper[4831]: I1203 08:58:48.441364 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2e131ef-51b4-4d7b-95f1-be753d22436a-config-data\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.445950 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b2e131ef-51b4-4d7b-95f1-be753d22436a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.458096 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fm85\" (UniqueName: \"kubernetes.io/projected/b2e131ef-51b4-4d7b-95f1-be753d22436a-kube-api-access-2fm85\") pod \"nova-scheduler-0\" (UID: \"b2e131ef-51b4-4d7b-95f1-be753d22436a\") " pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.502240 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.506525 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.602636 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.669485 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pw2g6\" (UniqueName: \"kubernetes.io/projected/22841d09-0c8c-49b0-a674-8bc092431ad9-kube-api-access-pw2g6\") pod \"22841d09-0c8c-49b0-a674-8bc092431ad9\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.669614 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e306eb84-97a5-44b8-9675-018da9b131a2-logs\") pod \"e306eb84-97a5-44b8-9675-018da9b131a2\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.669688 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-combined-ca-bundle\") pod \"22841d09-0c8c-49b0-a674-8bc092431ad9\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.669703 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-combined-ca-bundle\") pod \"e306eb84-97a5-44b8-9675-018da9b131a2\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.669752 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/22841d09-0c8c-49b0-a674-8bc092431ad9-logs\") pod \"22841d09-0c8c-49b0-a674-8bc092431ad9\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.669796 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-config-data\") pod \"e306eb84-97a5-44b8-9675-018da9b131a2\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.669860 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-config-data\") pod \"22841d09-0c8c-49b0-a674-8bc092431ad9\" (UID: \"22841d09-0c8c-49b0-a674-8bc092431ad9\") " Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.669882 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5ndt\" (UniqueName: \"kubernetes.io/projected/e306eb84-97a5-44b8-9675-018da9b131a2-kube-api-access-g5ndt\") pod \"e306eb84-97a5-44b8-9675-018da9b131a2\" (UID: \"e306eb84-97a5-44b8-9675-018da9b131a2\") " Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.674979 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e306eb84-97a5-44b8-9675-018da9b131a2-logs" (OuterVolumeSpecName: "logs") pod "e306eb84-97a5-44b8-9675-018da9b131a2" (UID: "e306eb84-97a5-44b8-9675-018da9b131a2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.675664 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22841d09-0c8c-49b0-a674-8bc092431ad9-logs" (OuterVolumeSpecName: "logs") pod "22841d09-0c8c-49b0-a674-8bc092431ad9" (UID: "22841d09-0c8c-49b0-a674-8bc092431ad9"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.682778 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e306eb84-97a5-44b8-9675-018da9b131a2-kube-api-access-g5ndt" (OuterVolumeSpecName: "kube-api-access-g5ndt") pod "e306eb84-97a5-44b8-9675-018da9b131a2" (UID: "e306eb84-97a5-44b8-9675-018da9b131a2"). InnerVolumeSpecName "kube-api-access-g5ndt". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.684038 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22841d09-0c8c-49b0-a674-8bc092431ad9-kube-api-access-pw2g6" (OuterVolumeSpecName: "kube-api-access-pw2g6") pod "22841d09-0c8c-49b0-a674-8bc092431ad9" (UID: "22841d09-0c8c-49b0-a674-8bc092431ad9"). InnerVolumeSpecName "kube-api-access-pw2g6". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.723679 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-config-data" (OuterVolumeSpecName: "config-data") pod "22841d09-0c8c-49b0-a674-8bc092431ad9" (UID: "22841d09-0c8c-49b0-a674-8bc092431ad9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.754650 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "22841d09-0c8c-49b0-a674-8bc092431ad9" (UID: "22841d09-0c8c-49b0-a674-8bc092431ad9"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.776573 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-config-data" (OuterVolumeSpecName: "config-data") pod "e306eb84-97a5-44b8-9675-018da9b131a2" (UID: "e306eb84-97a5-44b8-9675-018da9b131a2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.779061 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.779100 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5ndt\" (UniqueName: \"kubernetes.io/projected/e306eb84-97a5-44b8-9675-018da9b131a2-kube-api-access-g5ndt\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.779110 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pw2g6\" (UniqueName: \"kubernetes.io/projected/22841d09-0c8c-49b0-a674-8bc092431ad9-kube-api-access-pw2g6\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.779121 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e306eb84-97a5-44b8-9675-018da9b131a2-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.779130 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/22841d09-0c8c-49b0-a674-8bc092431ad9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.779139 4831 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/22841d09-0c8c-49b0-a674-8bc092431ad9-logs\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.779147 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.791016 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e306eb84-97a5-44b8-9675-018da9b131a2" (UID: "e306eb84-97a5-44b8-9675-018da9b131a2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.825629 4831 generic.go:334] "Generic (PLEG): container finished" podID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerID="8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794" exitCode=0 Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.826088 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22841d09-0c8c-49b0-a674-8bc092431ad9","Type":"ContainerDied","Data":"8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794"} Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.826521 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"22841d09-0c8c-49b0-a674-8bc092431ad9","Type":"ContainerDied","Data":"0f4ac62a9a26eeab2a44f876d31982e589bc149781159d72c65c09be1539b92f"} Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.826615 4831 scope.go:117] "RemoveContainer" containerID="8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.827333 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.834113 4831 generic.go:334] "Generic (PLEG): container finished" podID="e306eb84-97a5-44b8-9675-018da9b131a2" containerID="4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598" exitCode=0 Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.836817 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.838810 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e306eb84-97a5-44b8-9675-018da9b131a2","Type":"ContainerDied","Data":"4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598"} Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.838855 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"e306eb84-97a5-44b8-9675-018da9b131a2","Type":"ContainerDied","Data":"c46966245dc37b5e552e6963f251fa84110f2946eda6c2bfa1a06399e75a26f1"} Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.881473 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e306eb84-97a5-44b8-9675-018da9b131a2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.941701 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.941820 4831 scope.go:117] "RemoveContainer" containerID="3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283" Dec 03 08:58:48 crc kubenswrapper[4831]: I1203 08:58:48.979797 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.005771 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:58:49 crc 
kubenswrapper[4831]: I1203 08:58:49.016941 4831 scope.go:117] "RemoveContainer" containerID="8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794" Dec 03 08:58:49 crc kubenswrapper[4831]: E1203 08:58:49.018602 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794\": container with ID starting with 8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794 not found: ID does not exist" containerID="8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.018696 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794"} err="failed to get container status \"8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794\": rpc error: code = NotFound desc = could not find container \"8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794\": container with ID starting with 8d13188b6a11edaf0f068aab6ccc351d8563743e7c04acfee07314ab55af7794 not found: ID does not exist" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.018731 4831 scope.go:117] "RemoveContainer" containerID="3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283" Dec 03 08:58:49 crc kubenswrapper[4831]: E1203 08:58:49.034553 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283\": container with ID starting with 3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283 not found: ID does not exist" containerID="3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.034985 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283"} err="failed to get container status \"3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283\": rpc error: code = NotFound desc = could not find container \"3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283\": container with ID starting with 3f6898f03c18071164b5bd2e1f6e1bd607c4b65e8b1eae9acb8f07a6aa623283 not found: ID does not exist" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.036160 4831 scope.go:117] "RemoveContainer" containerID="4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.093673 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" path="/var/lib/kubelet/pods/22841d09-0c8c-49b0-a674-8bc092431ad9/volumes" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.102278 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4856725f-47d3-4087-873b-89e7d53b6b0d" path="/var/lib/kubelet/pods/4856725f-47d3-4087-873b-89e7d53b6b0d/volumes" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.106602 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5e68d3fd-75f1-44e6-bbe6-3ce6a2657135" path="/var/lib/kubelet/pods/5e68d3fd-75f1-44e6-bbe6-3ce6a2657135/volumes" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.108004 4831 scope.go:117] "RemoveContainer" containerID="1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.109810 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f75ecfa5-4e93-432b-907b-e79a5de81fc9" path="/var/lib/kubelet/pods/f75ecfa5-4e93-432b-907b-e79a5de81fc9/volumes" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.110477 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:58:49 crc 
kubenswrapper[4831]: E1203 08:58:49.110843 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-log" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.110860 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-log" Dec 03 08:58:49 crc kubenswrapper[4831]: E1203 08:58:49.110898 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-metadata" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.110904 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-metadata" Dec 03 08:58:49 crc kubenswrapper[4831]: E1203 08:58:49.110916 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-api" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.110922 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-api" Dec 03 08:58:49 crc kubenswrapper[4831]: E1203 08:58:49.110938 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-log" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.110944 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-log" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.115136 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-log" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.115195 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-api" Dec 03 08:58:49 crc 
kubenswrapper[4831]: I1203 08:58:49.115213 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="22841d09-0c8c-49b0-a674-8bc092431ad9" containerName="nova-metadata-metadata" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.115226 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" containerName="nova-api-log" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.126169 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.126291 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.128980 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.150979 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.178852 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.179075 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.182836 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.187085 4831 scope.go:117] "RemoveContainer" containerID="4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598" Dec 03 08:58:49 crc kubenswrapper[4831]: E1203 08:58:49.188813 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598\": container with ID starting with 4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598 not found: ID does not exist" containerID="4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.188873 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598"} err="failed to get container status \"4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598\": rpc error: code = NotFound desc = could not find container \"4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598\": container with ID starting with 4e66c7cf75c0c3304e4b90e1c3e52b7136e800a21c8e342fc96adee70a183598 not found: ID does not exist" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.188940 4831 scope.go:117] "RemoveContainer" containerID="1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb" Dec 03 08:58:49 crc kubenswrapper[4831]: E1203 08:58:49.189558 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb\": container with ID starting with 1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb not found: 
ID does not exist" containerID="1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.189586 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb"} err="failed to get container status \"1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb\": rpc error: code = NotFound desc = could not find container \"1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb\": container with ID starting with 1ffa5f648dc2bef875942469a166ba6b5974d8cb1edfbd6b2b963cb68f7fb5cb not found: ID does not exist" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.196990 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.253268 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.300335 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913de50c-5388-47ab-9bc1-32292ef5c42f-config-data\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.300854 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/913de50c-5388-47ab-9bc1-32292ef5c42f-logs\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.301072 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fafc47f-ac43-45da-b0da-9941dbdc87f1-combined-ca-bundle\") pod 
\"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.301402 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz5b8\" (UniqueName: \"kubernetes.io/projected/9fafc47f-ac43-45da-b0da-9941dbdc87f1-kube-api-access-lz5b8\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.301718 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913de50c-5388-47ab-9bc1-32292ef5c42f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.302004 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47slj\" (UniqueName: \"kubernetes.io/projected/913de50c-5388-47ab-9bc1-32292ef5c42f-kube-api-access-47slj\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.302674 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fafc47f-ac43-45da-b0da-9941dbdc87f1-config-data\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.302804 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fafc47f-ac43-45da-b0da-9941dbdc87f1-logs\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 
03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.310196 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.403530 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.404344 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/913de50c-5388-47ab-9bc1-32292ef5c42f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.404427 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-47slj\" (UniqueName: \"kubernetes.io/projected/913de50c-5388-47ab-9bc1-32292ef5c42f-kube-api-access-47slj\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.404584 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fafc47f-ac43-45da-b0da-9941dbdc87f1-config-data\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.404600 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fafc47f-ac43-45da-b0da-9941dbdc87f1-logs\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.404676 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913de50c-5388-47ab-9bc1-32292ef5c42f-config-data\") pod \"nova-api-0\" 
(UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.404730 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/913de50c-5388-47ab-9bc1-32292ef5c42f-logs\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.404757 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fafc47f-ac43-45da-b0da-9941dbdc87f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.404796 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lz5b8\" (UniqueName: \"kubernetes.io/projected/9fafc47f-ac43-45da-b0da-9941dbdc87f1-kube-api-access-lz5b8\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.408451 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9fafc47f-ac43-45da-b0da-9941dbdc87f1-logs\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.408720 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/913de50c-5388-47ab-9bc1-32292ef5c42f-logs\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.408940 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/913de50c-5388-47ab-9bc1-32292ef5c42f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.410807 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9fafc47f-ac43-45da-b0da-9941dbdc87f1-config-data\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.413417 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/913de50c-5388-47ab-9bc1-32292ef5c42f-config-data\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: W1203 08:58:49.415266 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2e131ef_51b4_4d7b_95f1_be753d22436a.slice/crio-6e6ec4fe78ef2d21ce602544a95debff782259e2f3c770ab61bb3263c2b14176 WatchSource:0}: Error finding container 6e6ec4fe78ef2d21ce602544a95debff782259e2f3c770ab61bb3263c2b14176: Status 404 returned error can't find the container with id 6e6ec4fe78ef2d21ce602544a95debff782259e2f3c770ab61bb3263c2b14176 Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.416736 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9fafc47f-ac43-45da-b0da-9941dbdc87f1-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.423776 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz5b8\" (UniqueName: 
\"kubernetes.io/projected/9fafc47f-ac43-45da-b0da-9941dbdc87f1-kube-api-access-lz5b8\") pod \"nova-metadata-0\" (UID: \"9fafc47f-ac43-45da-b0da-9941dbdc87f1\") " pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.424025 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-47slj\" (UniqueName: \"kubernetes.io/projected/913de50c-5388-47ab-9bc1-32292ef5c42f-kube-api-access-47slj\") pod \"nova-api-0\" (UID: \"913de50c-5388-47ab-9bc1-32292ef5c42f\") " pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.500052 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.527541 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.855391 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2a89396d-a305-4cea-b054-d3ad772f79e0","Type":"ContainerStarted","Data":"aa82d9b57ea91632c5b344ab7568e1e995ed20f1207875c1ebe7898be3fa7193"} Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.855954 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"2a89396d-a305-4cea-b054-d3ad772f79e0","Type":"ContainerStarted","Data":"b1237389ee4221afb0b247a5fbb155c2fcf13a7e2573c7b114c7922f5db41324"} Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.856022 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.864841 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a2be4cea-396e-49b0-aef1-4c28ac8dcd78","Type":"ContainerStarted","Data":"d1895adc55d3ff323395464864d0db5323b713d9bab0325b4b46a377fc29e326"} Dec 03 
08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.864890 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"a2be4cea-396e-49b0-aef1-4c28ac8dcd78","Type":"ContainerStarted","Data":"1f5308290b39ab77280ee49db88eaef8e6a94c8bc8860e2fe2dba39a89a449f8"} Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.865087 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.872902 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2e131ef-51b4-4d7b-95f1-be753d22436a","Type":"ContainerStarted","Data":"834c2b88ca90f0893d2b50355835597c5582e29725c4bfff12e1c5a8e9740cbf"} Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.872951 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"b2e131ef-51b4-4d7b-95f1-be753d22436a","Type":"ContainerStarted","Data":"6e6ec4fe78ef2d21ce602544a95debff782259e2f3c770ab61bb3263c2b14176"} Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.882070 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.882049749 podStartE2EDuration="2.882049749s" podCreationTimestamp="2025-12-03 08:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:49.872244284 +0000 UTC m=+8867.215827792" watchObservedRunningTime="2025-12-03 08:58:49.882049749 +0000 UTC m=+8867.225633267" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.897119 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.897094578 podStartE2EDuration="2.897094578s" podCreationTimestamp="2025-12-03 08:58:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:49.889953135 +0000 UTC m=+8867.233536653" watchObservedRunningTime="2025-12-03 08:58:49.897094578 +0000 UTC m=+8867.240678086" Dec 03 08:58:49 crc kubenswrapper[4831]: I1203 08:58:49.922162 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.922139138 podStartE2EDuration="1.922139138s" podCreationTimestamp="2025-12-03 08:58:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:49.90326303 +0000 UTC m=+8867.246846548" watchObservedRunningTime="2025-12-03 08:58:49.922139138 +0000 UTC m=+8867.265722646" Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.072026 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Dec 03 08:58:50 crc kubenswrapper[4831]: W1203 08:58:50.074741 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod913de50c_5388_47ab_9bc1_32292ef5c42f.slice/crio-c78a2b7e2bf48c5d951ffdcb54356742aa445e14d66be9e2d1995b9b87bc7a51 WatchSource:0}: Error finding container c78a2b7e2bf48c5d951ffdcb54356742aa445e14d66be9e2d1995b9b87bc7a51: Status 404 returned error can't find the container with id c78a2b7e2bf48c5d951ffdcb54356742aa445e14d66be9e2d1995b9b87bc7a51 Dec 03 08:58:50 crc kubenswrapper[4831]: W1203 08:58:50.081906 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fafc47f_ac43_45da_b0da_9941dbdc87f1.slice/crio-6ff78e87ea2a04460cdd8c065d16e505b880158eb6917d550c7ec3ad9b3e726a WatchSource:0}: Error finding container 6ff78e87ea2a04460cdd8c065d16e505b880158eb6917d550c7ec3ad9b3e726a: Status 404 returned error can't find the container with id 
6ff78e87ea2a04460cdd8c065d16e505b880158eb6917d550c7ec3ad9b3e726a Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.084814 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.884956 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9fafc47f-ac43-45da-b0da-9941dbdc87f1","Type":"ContainerStarted","Data":"3d783a7c3001d0b7380a37a703bc7354190618bb4a9fc0d9193230bb7ef1287a"} Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.885364 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9fafc47f-ac43-45da-b0da-9941dbdc87f1","Type":"ContainerStarted","Data":"73d2823622cd4973adf9e611a01881aa59d7649e6fe3ff579c5913d4d158053e"} Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.885384 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"9fafc47f-ac43-45da-b0da-9941dbdc87f1","Type":"ContainerStarted","Data":"6ff78e87ea2a04460cdd8c065d16e505b880158eb6917d550c7ec3ad9b3e726a"} Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.887659 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"913de50c-5388-47ab-9bc1-32292ef5c42f","Type":"ContainerStarted","Data":"61e9f489d849e59ccaa2686a041033f0960913e6cd842591091d4bf45bcf1a57"} Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.887718 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"913de50c-5388-47ab-9bc1-32292ef5c42f","Type":"ContainerStarted","Data":"753e465a9fd91fec5212221949c26819d7d5bab1d27766dea2dfbcd747ae48c6"} Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.887733 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"913de50c-5388-47ab-9bc1-32292ef5c42f","Type":"ContainerStarted","Data":"c78a2b7e2bf48c5d951ffdcb54356742aa445e14d66be9e2d1995b9b87bc7a51"} Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.923927 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.923907979 podStartE2EDuration="2.923907979s" podCreationTimestamp="2025-12-03 08:58:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:50.900479249 +0000 UTC m=+8868.244062747" watchObservedRunningTime="2025-12-03 08:58:50.923907979 +0000 UTC m=+8868.267491487" Dec 03 08:58:50 crc kubenswrapper[4831]: I1203 08:58:50.957151 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.957124313 podStartE2EDuration="2.957124313s" podCreationTimestamp="2025-12-03 08:58:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 08:58:50.93710093 +0000 UTC m=+8868.280684458" watchObservedRunningTime="2025-12-03 08:58:50.957124313 +0000 UTC m=+8868.300707831" Dec 03 08:58:51 crc kubenswrapper[4831]: I1203 08:58:51.031613 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e306eb84-97a5-44b8-9675-018da9b131a2" path="/var/lib/kubelet/pods/e306eb84-97a5-44b8-9675-018da9b131a2/volumes" Dec 03 08:58:53 crc kubenswrapper[4831]: I1203 08:58:53.503142 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Dec 03 08:58:54 crc kubenswrapper[4831]: I1203 08:58:54.500468 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Dec 03 08:58:54 crc kubenswrapper[4831]: I1203 08:58:54.500798 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-metadata-0" Dec 03 08:58:58 crc kubenswrapper[4831]: I1203 08:58:58.462473 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Dec 03 08:58:58 crc kubenswrapper[4831]: I1203 08:58:58.472962 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Dec 03 08:58:58 crc kubenswrapper[4831]: I1203 08:58:58.502783 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Dec 03 08:58:58 crc kubenswrapper[4831]: I1203 08:58:58.547890 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Dec 03 08:58:59 crc kubenswrapper[4831]: I1203 08:58:59.031308 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Dec 03 08:58:59 crc kubenswrapper[4831]: I1203 08:58:59.500388 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 08:58:59 crc kubenswrapper[4831]: I1203 08:58:59.500454 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Dec 03 08:58:59 crc kubenswrapper[4831]: I1203 08:58:59.528144 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 08:58:59 crc kubenswrapper[4831]: I1203 08:58:59.528274 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Dec 03 08:59:00 crc kubenswrapper[4831]: I1203 08:59:00.623637 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9fafc47f-ac43-45da-b0da-9941dbdc87f1" containerName="nova-metadata-log" probeResult="failure" output="Get \"http://10.217.1.199:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:59:00 crc kubenswrapper[4831]: I1203 
08:59:00.623576 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="9fafc47f-ac43-45da-b0da-9941dbdc87f1" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"http://10.217.1.199:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:59:00 crc kubenswrapper[4831]: I1203 08:59:00.623854 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="913de50c-5388-47ab-9bc1-32292ef5c42f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.1.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:59:00 crc kubenswrapper[4831]: I1203 08:59:00.623815 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="913de50c-5388-47ab-9bc1-32292ef5c42f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.1.200:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Dec 03 08:59:09 crc kubenswrapper[4831]: I1203 08:59:09.503179 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 08:59:09 crc kubenswrapper[4831]: I1203 08:59:09.503921 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Dec 03 08:59:09 crc kubenswrapper[4831]: I1203 08:59:09.505802 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 08:59:09 crc kubenswrapper[4831]: I1203 08:59:09.505874 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Dec 03 08:59:09 crc kubenswrapper[4831]: I1203 08:59:09.538027 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 08:59:09 crc kubenswrapper[4831]: I1203 08:59:09.538346 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 08:59:09 crc kubenswrapper[4831]: I1203 08:59:09.538878 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Dec 03 08:59:09 crc kubenswrapper[4831]: I1203 08:59:09.541181 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 08:59:10 crc kubenswrapper[4831]: I1203 08:59:10.150244 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Dec 03 08:59:10 crc kubenswrapper[4831]: I1203 08:59:10.153614 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Dec 03 08:59:57 crc kubenswrapper[4831]: I1203 08:59:57.597250 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 08:59:57 crc kubenswrapper[4831]: I1203 08:59:57.597923 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.167923 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c"] Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.170237 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.172620 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.175401 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.185890 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c"] Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.240407 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41be7f8a-432f-4ec6-8abd-190a8be28327-secret-volume\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.240603 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41be7f8a-432f-4ec6-8abd-190a8be28327-config-volume\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.240645 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vz8jv\" (UniqueName: \"kubernetes.io/projected/41be7f8a-432f-4ec6-8abd-190a8be28327-kube-api-access-vz8jv\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.342052 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41be7f8a-432f-4ec6-8abd-190a8be28327-config-volume\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.342114 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vz8jv\" (UniqueName: \"kubernetes.io/projected/41be7f8a-432f-4ec6-8abd-190a8be28327-kube-api-access-vz8jv\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.342190 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41be7f8a-432f-4ec6-8abd-190a8be28327-secret-volume\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.343307 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41be7f8a-432f-4ec6-8abd-190a8be28327-config-volume\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.351461 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/41be7f8a-432f-4ec6-8abd-190a8be28327-secret-volume\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.359208 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vz8jv\" (UniqueName: \"kubernetes.io/projected/41be7f8a-432f-4ec6-8abd-190a8be28327-kube-api-access-vz8jv\") pod \"collect-profiles-29412540-d6j7c\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:00 crc kubenswrapper[4831]: I1203 09:00:00.503868 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:01 crc kubenswrapper[4831]: I1203 09:00:01.045253 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c"] Dec 03 09:00:01 crc kubenswrapper[4831]: E1203 09:00:01.690873 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41be7f8a_432f_4ec6_8abd_190a8be28327.slice/crio-conmon-9519ad879e1c0dcb06ed80824fc341435bf69419f811cefb505644844e180766.scope\": RecentStats: unable to find data in memory cache]" Dec 03 09:00:01 crc kubenswrapper[4831]: I1203 09:00:01.733997 4831 generic.go:334] "Generic (PLEG): container finished" podID="41be7f8a-432f-4ec6-8abd-190a8be28327" containerID="9519ad879e1c0dcb06ed80824fc341435bf69419f811cefb505644844e180766" exitCode=0 Dec 03 09:00:01 crc kubenswrapper[4831]: I1203 09:00:01.734053 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" 
event={"ID":"41be7f8a-432f-4ec6-8abd-190a8be28327","Type":"ContainerDied","Data":"9519ad879e1c0dcb06ed80824fc341435bf69419f811cefb505644844e180766"} Dec 03 09:00:01 crc kubenswrapper[4831]: I1203 09:00:01.734446 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" event={"ID":"41be7f8a-432f-4ec6-8abd-190a8be28327","Type":"ContainerStarted","Data":"b0b65522018550d02a00a13d7f85b12c9bb4c3da88e166f723fca25d86f4f52f"} Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.212325 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.400080 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41be7f8a-432f-4ec6-8abd-190a8be28327-secret-volume\") pod \"41be7f8a-432f-4ec6-8abd-190a8be28327\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.400589 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vz8jv\" (UniqueName: \"kubernetes.io/projected/41be7f8a-432f-4ec6-8abd-190a8be28327-kube-api-access-vz8jv\") pod \"41be7f8a-432f-4ec6-8abd-190a8be28327\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.400651 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41be7f8a-432f-4ec6-8abd-190a8be28327-config-volume\") pod \"41be7f8a-432f-4ec6-8abd-190a8be28327\" (UID: \"41be7f8a-432f-4ec6-8abd-190a8be28327\") " Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.401999 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/41be7f8a-432f-4ec6-8abd-190a8be28327-config-volume" 
(OuterVolumeSpecName: "config-volume") pod "41be7f8a-432f-4ec6-8abd-190a8be28327" (UID: "41be7f8a-432f-4ec6-8abd-190a8be28327"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.412816 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41be7f8a-432f-4ec6-8abd-190a8be28327-kube-api-access-vz8jv" (OuterVolumeSpecName: "kube-api-access-vz8jv") pod "41be7f8a-432f-4ec6-8abd-190a8be28327" (UID: "41be7f8a-432f-4ec6-8abd-190a8be28327"). InnerVolumeSpecName "kube-api-access-vz8jv". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.413497 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41be7f8a-432f-4ec6-8abd-190a8be28327-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "41be7f8a-432f-4ec6-8abd-190a8be28327" (UID: "41be7f8a-432f-4ec6-8abd-190a8be28327"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.504760 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/41be7f8a-432f-4ec6-8abd-190a8be28327-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.504810 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vz8jv\" (UniqueName: \"kubernetes.io/projected/41be7f8a-432f-4ec6-8abd-190a8be28327-kube-api-access-vz8jv\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.504823 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/41be7f8a-432f-4ec6-8abd-190a8be28327-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.760149 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" event={"ID":"41be7f8a-432f-4ec6-8abd-190a8be28327","Type":"ContainerDied","Data":"b0b65522018550d02a00a13d7f85b12c9bb4c3da88e166f723fca25d86f4f52f"} Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.760196 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b0b65522018550d02a00a13d7f85b12c9bb4c3da88e166f723fca25d86f4f52f" Dec 03 09:00:03 crc kubenswrapper[4831]: I1203 09:00:03.760718 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412540-d6j7c" Dec 03 09:00:04 crc kubenswrapper[4831]: I1203 09:00:04.337417 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"] Dec 03 09:00:04 crc kubenswrapper[4831]: I1203 09:00:04.355823 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412495-dsnxs"] Dec 03 09:00:05 crc kubenswrapper[4831]: I1203 09:00:05.038675 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9cbbf48-7afa-4974-abdf-29610bca3012" path="/var/lib/kubelet/pods/f9cbbf48-7afa-4974-abdf-29610bca3012/volumes" Dec 03 09:00:27 crc kubenswrapper[4831]: I1203 09:00:27.596638 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:00:27 crc kubenswrapper[4831]: I1203 09:00:27.597259 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:00:27 crc kubenswrapper[4831]: I1203 09:00:27.614724 4831 scope.go:117] "RemoveContainer" containerID="f414287ed552dd798b0f0866d2be1f587127d848aa75325c91a72844d9535b21" Dec 03 09:00:57 crc kubenswrapper[4831]: I1203 09:00:57.597246 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Dec 03 09:00:57 crc kubenswrapper[4831]: I1203 09:00:57.598032 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:00:57 crc kubenswrapper[4831]: I1203 09:00:57.598099 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 09:00:57 crc kubenswrapper[4831]: I1203 09:00:57.599296 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"0cb79465c4851d9ccab0a172ec47f2783944898da9d4aedbce6d8b6c849dbeb6"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:00:57 crc kubenswrapper[4831]: I1203 09:00:57.599422 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://0cb79465c4851d9ccab0a172ec47f2783944898da9d4aedbce6d8b6c849dbeb6" gracePeriod=600 Dec 03 09:00:58 crc kubenswrapper[4831]: I1203 09:00:58.376948 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="0cb79465c4851d9ccab0a172ec47f2783944898da9d4aedbce6d8b6c849dbeb6" exitCode=0 Dec 03 09:00:58 crc kubenswrapper[4831]: I1203 09:00:58.377136 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"0cb79465c4851d9ccab0a172ec47f2783944898da9d4aedbce6d8b6c849dbeb6"} Dec 03 09:00:58 crc kubenswrapper[4831]: I1203 09:00:58.377370 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2"} Dec 03 09:00:58 crc kubenswrapper[4831]: I1203 09:00:58.377393 4831 scope.go:117] "RemoveContainer" containerID="421fc2704f0f234fbdab13a9d44757f9a21558ccef91f7e881550d37771b8eed" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.167074 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29412541-9qw4r"] Dec 03 09:01:00 crc kubenswrapper[4831]: E1203 09:01:00.168195 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="41be7f8a-432f-4ec6-8abd-190a8be28327" containerName="collect-profiles" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.168212 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="41be7f8a-432f-4ec6-8abd-190a8be28327" containerName="collect-profiles" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.168493 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="41be7f8a-432f-4ec6-8abd-190a8be28327" containerName="collect-profiles" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.169730 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.183998 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412541-9qw4r"] Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.301409 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-combined-ca-bundle\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.301856 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g7nc\" (UniqueName: \"kubernetes.io/projected/b8d4be97-9167-4feb-b779-ba8db1269611-kube-api-access-4g7nc\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.302080 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-config-data\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.302234 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-fernet-keys\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.403848 4831 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-combined-ca-bundle\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.404098 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g7nc\" (UniqueName: \"kubernetes.io/projected/b8d4be97-9167-4feb-b779-ba8db1269611-kube-api-access-4g7nc\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.404274 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-config-data\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.404394 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-fernet-keys\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.410511 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-combined-ca-bundle\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.410640 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-config-data\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.414402 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-fernet-keys\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.426645 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g7nc\" (UniqueName: \"kubernetes.io/projected/b8d4be97-9167-4feb-b779-ba8db1269611-kube-api-access-4g7nc\") pod \"keystone-cron-29412541-9qw4r\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:00 crc kubenswrapper[4831]: I1203 09:01:00.501923 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:01 crc kubenswrapper[4831]: W1203 09:01:01.067008 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d4be97_9167_4feb_b779_ba8db1269611.slice/crio-b1cd184f3d4a16e99a02be1d85af8e2597f84514b98698235101a44330f35270 WatchSource:0}: Error finding container b1cd184f3d4a16e99a02be1d85af8e2597f84514b98698235101a44330f35270: Status 404 returned error can't find the container with id b1cd184f3d4a16e99a02be1d85af8e2597f84514b98698235101a44330f35270 Dec 03 09:01:01 crc kubenswrapper[4831]: I1203 09:01:01.067982 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29412541-9qw4r"] Dec 03 09:01:01 crc kubenswrapper[4831]: I1203 09:01:01.416741 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-9qw4r" event={"ID":"b8d4be97-9167-4feb-b779-ba8db1269611","Type":"ContainerStarted","Data":"257aedfb686d30f9aa640b0238c3536f6ece0598f3e12f0f4c0bc471f14e5909"} Dec 03 09:01:01 crc kubenswrapper[4831]: I1203 09:01:01.417066 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-9qw4r" event={"ID":"b8d4be97-9167-4feb-b779-ba8db1269611","Type":"ContainerStarted","Data":"b1cd184f3d4a16e99a02be1d85af8e2597f84514b98698235101a44330f35270"} Dec 03 09:01:01 crc kubenswrapper[4831]: I1203 09:01:01.440691 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29412541-9qw4r" podStartSLOduration=1.440669111 podStartE2EDuration="1.440669111s" podCreationTimestamp="2025-12-03 09:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-03 09:01:01.43484676 +0000 UTC m=+8998.778430268" watchObservedRunningTime="2025-12-03 09:01:01.440669111 +0000 UTC m=+8998.784252619" Dec 03 09:01:03 crc 
kubenswrapper[4831]: E1203 09:01:03.803088 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb8d4be97_9167_4feb_b779_ba8db1269611.slice/crio-257aedfb686d30f9aa640b0238c3536f6ece0598f3e12f0f4c0bc471f14e5909.scope\": RecentStats: unable to find data in memory cache]" Dec 03 09:01:04 crc kubenswrapper[4831]: I1203 09:01:04.452672 4831 generic.go:334] "Generic (PLEG): container finished" podID="b8d4be97-9167-4feb-b779-ba8db1269611" containerID="257aedfb686d30f9aa640b0238c3536f6ece0598f3e12f0f4c0bc471f14e5909" exitCode=0 Dec 03 09:01:04 crc kubenswrapper[4831]: I1203 09:01:04.452882 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-9qw4r" event={"ID":"b8d4be97-9167-4feb-b779-ba8db1269611","Type":"ContainerDied","Data":"257aedfb686d30f9aa640b0238c3536f6ece0598f3e12f0f4c0bc471f14e5909"} Dec 03 09:01:05 crc kubenswrapper[4831]: I1203 09:01:05.888003 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.023920 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g7nc\" (UniqueName: \"kubernetes.io/projected/b8d4be97-9167-4feb-b779-ba8db1269611-kube-api-access-4g7nc\") pod \"b8d4be97-9167-4feb-b779-ba8db1269611\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.024055 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-config-data\") pod \"b8d4be97-9167-4feb-b779-ba8db1269611\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.024276 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-combined-ca-bundle\") pod \"b8d4be97-9167-4feb-b779-ba8db1269611\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.024345 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-fernet-keys\") pod \"b8d4be97-9167-4feb-b779-ba8db1269611\" (UID: \"b8d4be97-9167-4feb-b779-ba8db1269611\") " Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.045357 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "b8d4be97-9167-4feb-b779-ba8db1269611" (UID: "b8d4be97-9167-4feb-b779-ba8db1269611"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.047529 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8d4be97-9167-4feb-b779-ba8db1269611-kube-api-access-4g7nc" (OuterVolumeSpecName: "kube-api-access-4g7nc") pod "b8d4be97-9167-4feb-b779-ba8db1269611" (UID: "b8d4be97-9167-4feb-b779-ba8db1269611"). InnerVolumeSpecName "kube-api-access-4g7nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.058525 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8d4be97-9167-4feb-b779-ba8db1269611" (UID: "b8d4be97-9167-4feb-b779-ba8db1269611"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.087035 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-config-data" (OuterVolumeSpecName: "config-data") pod "b8d4be97-9167-4feb-b779-ba8db1269611" (UID: "b8d4be97-9167-4feb-b779-ba8db1269611"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.127017 4831 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.127053 4831 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-fernet-keys\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.127066 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g7nc\" (UniqueName: \"kubernetes.io/projected/b8d4be97-9167-4feb-b779-ba8db1269611-kube-api-access-4g7nc\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.127078 4831 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8d4be97-9167-4feb-b779-ba8db1269611-config-data\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.474132 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29412541-9qw4r" event={"ID":"b8d4be97-9167-4feb-b779-ba8db1269611","Type":"ContainerDied","Data":"b1cd184f3d4a16e99a02be1d85af8e2597f84514b98698235101a44330f35270"} Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.474559 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1cd184f3d4a16e99a02be1d85af8e2597f84514b98698235101a44330f35270" Dec 03 09:01:06 crc kubenswrapper[4831]: I1203 09:01:06.474233 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29412541-9qw4r" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.207120 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ngslh"] Dec 03 09:01:34 crc kubenswrapper[4831]: E1203 09:01:34.208367 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8d4be97-9167-4feb-b779-ba8db1269611" containerName="keystone-cron" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.208385 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8d4be97-9167-4feb-b779-ba8db1269611" containerName="keystone-cron" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.208702 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8d4be97-9167-4feb-b779-ba8db1269611" containerName="keystone-cron" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.210985 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.225733 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngslh"] Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.283746 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-utilities\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.283782 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-catalog-content\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" 
Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.283860 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c2lx2\" (UniqueName: \"kubernetes.io/projected/8e9f20e9-0fef-4514-98b7-95513a86ff1c-kube-api-access-c2lx2\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.386605 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-utilities\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.386658 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-catalog-content\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.386790 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c2lx2\" (UniqueName: \"kubernetes.io/projected/8e9f20e9-0fef-4514-98b7-95513a86ff1c-kube-api-access-c2lx2\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.387263 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-utilities\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc 
kubenswrapper[4831]: I1203 09:01:34.388645 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-catalog-content\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.408501 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2lx2\" (UniqueName: \"kubernetes.io/projected/8e9f20e9-0fef-4514-98b7-95513a86ff1c-kube-api-access-c2lx2\") pod \"redhat-operators-ngslh\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:34 crc kubenswrapper[4831]: I1203 09:01:34.542659 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:35 crc kubenswrapper[4831]: I1203 09:01:35.073532 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ngslh"] Dec 03 09:01:35 crc kubenswrapper[4831]: I1203 09:01:35.820953 4831 generic.go:334] "Generic (PLEG): container finished" podID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerID="5b27e44bac93e2391f1e5134468625c12c54c0d9d9bccc0e2574663571963576" exitCode=0 Dec 03 09:01:35 crc kubenswrapper[4831]: I1203 09:01:35.821011 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngslh" event={"ID":"8e9f20e9-0fef-4514-98b7-95513a86ff1c","Type":"ContainerDied","Data":"5b27e44bac93e2391f1e5134468625c12c54c0d9d9bccc0e2574663571963576"} Dec 03 09:01:35 crc kubenswrapper[4831]: I1203 09:01:35.821254 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngslh" 
event={"ID":"8e9f20e9-0fef-4514-98b7-95513a86ff1c","Type":"ContainerStarted","Data":"9dd5bf559700774f12ec588f2b6d349665db29f7236f23fd80dbcf07affea03e"} Dec 03 09:01:35 crc kubenswrapper[4831]: I1203 09:01:35.823230 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:01:37 crc kubenswrapper[4831]: I1203 09:01:37.843803 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngslh" event={"ID":"8e9f20e9-0fef-4514-98b7-95513a86ff1c","Type":"ContainerStarted","Data":"848266d7a0c7bd8941a2658681e684d7af8326e23dc0b3ad2048549cfc85a5ea"} Dec 03 09:01:40 crc kubenswrapper[4831]: I1203 09:01:40.881901 4831 generic.go:334] "Generic (PLEG): container finished" podID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerID="848266d7a0c7bd8941a2658681e684d7af8326e23dc0b3ad2048549cfc85a5ea" exitCode=0 Dec 03 09:01:40 crc kubenswrapper[4831]: I1203 09:01:40.881994 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngslh" event={"ID":"8e9f20e9-0fef-4514-98b7-95513a86ff1c","Type":"ContainerDied","Data":"848266d7a0c7bd8941a2658681e684d7af8326e23dc0b3ad2048549cfc85a5ea"} Dec 03 09:01:41 crc kubenswrapper[4831]: I1203 09:01:41.898459 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngslh" event={"ID":"8e9f20e9-0fef-4514-98b7-95513a86ff1c","Type":"ContainerStarted","Data":"52957348b0fd1d96351f730d86d3dbe6ab804a44ad019a750069b498cc04942c"} Dec 03 09:01:41 crc kubenswrapper[4831]: I1203 09:01:41.931655 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ngslh" podStartSLOduration=2.446810539 podStartE2EDuration="7.931622983s" podCreationTimestamp="2025-12-03 09:01:34 +0000 UTC" firstStartedPulling="2025-12-03 09:01:35.823019101 +0000 UTC m=+9033.166602609" lastFinishedPulling="2025-12-03 09:01:41.307831515 +0000 UTC m=+9038.651415053" 
observedRunningTime="2025-12-03 09:01:41.920768035 +0000 UTC m=+9039.264351543" watchObservedRunningTime="2025-12-03 09:01:41.931622983 +0000 UTC m=+9039.275206521" Dec 03 09:01:44 crc kubenswrapper[4831]: I1203 09:01:44.543209 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:44 crc kubenswrapper[4831]: I1203 09:01:44.543720 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:45 crc kubenswrapper[4831]: I1203 09:01:45.589444 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ngslh" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="registry-server" probeResult="failure" output=< Dec 03 09:01:45 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 09:01:45 crc kubenswrapper[4831]: > Dec 03 09:01:54 crc kubenswrapper[4831]: I1203 09:01:54.616047 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:54 crc kubenswrapper[4831]: I1203 09:01:54.686554 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:54 crc kubenswrapper[4831]: I1203 09:01:54.872143 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngslh"] Dec 03 09:01:56 crc kubenswrapper[4831]: I1203 09:01:56.094155 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ngslh" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="registry-server" containerID="cri-o://52957348b0fd1d96351f730d86d3dbe6ab804a44ad019a750069b498cc04942c" gracePeriod=2 Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.111653 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerID="52957348b0fd1d96351f730d86d3dbe6ab804a44ad019a750069b498cc04942c" exitCode=0 Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.111742 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngslh" event={"ID":"8e9f20e9-0fef-4514-98b7-95513a86ff1c","Type":"ContainerDied","Data":"52957348b0fd1d96351f730d86d3dbe6ab804a44ad019a750069b498cc04942c"} Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.111999 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ngslh" event={"ID":"8e9f20e9-0fef-4514-98b7-95513a86ff1c","Type":"ContainerDied","Data":"9dd5bf559700774f12ec588f2b6d349665db29f7236f23fd80dbcf07affea03e"} Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.112018 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9dd5bf559700774f12ec588f2b6d349665db29f7236f23fd80dbcf07affea03e" Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.390516 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.511217 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-catalog-content\") pod \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.511399 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c2lx2\" (UniqueName: \"kubernetes.io/projected/8e9f20e9-0fef-4514-98b7-95513a86ff1c-kube-api-access-c2lx2\") pod \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.511509 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-utilities\") pod \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\" (UID: \"8e9f20e9-0fef-4514-98b7-95513a86ff1c\") " Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.512509 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-utilities" (OuterVolumeSpecName: "utilities") pod "8e9f20e9-0fef-4514-98b7-95513a86ff1c" (UID: "8e9f20e9-0fef-4514-98b7-95513a86ff1c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.517933 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e9f20e9-0fef-4514-98b7-95513a86ff1c-kube-api-access-c2lx2" (OuterVolumeSpecName: "kube-api-access-c2lx2") pod "8e9f20e9-0fef-4514-98b7-95513a86ff1c" (UID: "8e9f20e9-0fef-4514-98b7-95513a86ff1c"). InnerVolumeSpecName "kube-api-access-c2lx2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.614461 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c2lx2\" (UniqueName: \"kubernetes.io/projected/8e9f20e9-0fef-4514-98b7-95513a86ff1c-kube-api-access-c2lx2\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.614493 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.614733 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8e9f20e9-0fef-4514-98b7-95513a86ff1c" (UID: "8e9f20e9-0fef-4514-98b7-95513a86ff1c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:01:57 crc kubenswrapper[4831]: I1203 09:01:57.717933 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8e9f20e9-0fef-4514-98b7-95513a86ff1c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:01:58 crc kubenswrapper[4831]: I1203 09:01:58.131892 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ngslh" Dec 03 09:01:58 crc kubenswrapper[4831]: I1203 09:01:58.201043 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ngslh"] Dec 03 09:01:58 crc kubenswrapper[4831]: I1203 09:01:58.214561 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ngslh"] Dec 03 09:01:59 crc kubenswrapper[4831]: I1203 09:01:59.037537 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" path="/var/lib/kubelet/pods/8e9f20e9-0fef-4514-98b7-95513a86ff1c/volumes" Dec 03 09:03:27 crc kubenswrapper[4831]: I1203 09:03:27.596579 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:03:27 crc kubenswrapper[4831]: I1203 09:03:27.597468 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:03:57 crc kubenswrapper[4831]: I1203 09:03:57.596265 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:03:57 crc kubenswrapper[4831]: I1203 09:03:57.596783 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:03:59 crc kubenswrapper[4831]: E1203 09:03:59.000074 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbfa3cd02_c3f9_4817_8c79_f57f2d2dfa4d.slice/crio-f32f4d1a918837e95d03cfc583751e02a14eb9b704a01c4074b8cdabc544e124.scope\": RecentStats: unable to find data in memory cache]" Dec 03 09:03:59 crc kubenswrapper[4831]: I1203 09:03:59.663469 4831 generic.go:334] "Generic (PLEG): container finished" podID="bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" containerID="f32f4d1a918837e95d03cfc583751e02a14eb9b704a01c4074b8cdabc544e124" exitCode=0 Dec 03 09:03:59 crc kubenswrapper[4831]: I1203 09:03:59.663577 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" event={"ID":"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d","Type":"ContainerDied","Data":"f32f4d1a918837e95d03cfc583751e02a14eb9b704a01c4074b8cdabc544e124"} Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.245363 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338411 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-1\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338475 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-1\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338516 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-combined-ca-bundle\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338573 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-0\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338662 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-1\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 
09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338751 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzzzg\" (UniqueName: \"kubernetes.io/projected/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-kube-api-access-lzzzg\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338820 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ceph\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338863 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-inventory\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338886 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-0\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338924 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-0\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.338956 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: 
\"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ssh-key\") pod \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\" (UID: \"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d\") " Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.351012 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ceph" (OuterVolumeSpecName: "ceph") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.360220 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-kube-api-access-lzzzg" (OuterVolumeSpecName: "kube-api-access-lzzzg") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "kube-api-access-lzzzg". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.376935 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-combined-ca-bundle" (OuterVolumeSpecName: "nova-cell1-combined-ca-bundle") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "nova-cell1-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.378849 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-1" (OuterVolumeSpecName: "nova-cells-global-config-1") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "nova-cells-global-config-1". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.385833 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-inventory" (OuterVolumeSpecName: "inventory") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.386931 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.387001 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-0" (OuterVolumeSpecName: "nova-cells-global-config-0") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "nova-cells-global-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.398673 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.402987 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "nova-cell1-compute-config-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.406793 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.409919 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" (UID: "bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.441972 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442007 4831 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442016 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442028 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-0\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442037 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cells-global-config-1\" (UniqueName: \"kubernetes.io/configmap/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cells-global-config-1\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442045 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzzzg\" (UniqueName: \"kubernetes.io/projected/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-kube-api-access-lzzzg\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442054 4831 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ceph\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442062 4831 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-inventory\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442070 4831 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442078 4831 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.442085 4831 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d-ssh-key\") on node \"crc\" DevicePath \"\"" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.683616 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" event={"ID":"bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d","Type":"ContainerDied","Data":"049fa21db10a0bd1e83834f1ffe2faa5e824647e9a2e18131ab4a841ab548247"} Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.684276 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="049fa21db10a0bd1e83834f1ffe2faa5e824647e9a2e18131ab4a841ab548247" Dec 03 09:04:01 crc kubenswrapper[4831]: I1203 09:04:01.683683 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk" Dec 03 09:04:21 crc kubenswrapper[4831]: E1203 09:04:21.139156 4831 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.234:54242->38.102.83.234:39573: write tcp 38.102.83.234:54242->38.102.83.234:39573: write: broken pipe Dec 03 09:04:27 crc kubenswrapper[4831]: I1203 09:04:27.597408 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:04:27 crc kubenswrapper[4831]: I1203 09:04:27.597876 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:04:27 crc kubenswrapper[4831]: I1203 09:04:27.597925 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 09:04:27 crc kubenswrapper[4831]: I1203 09:04:27.598841 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:04:27 crc kubenswrapper[4831]: I1203 09:04:27.598910 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" gracePeriod=600 Dec 03 09:04:28 crc kubenswrapper[4831]: E1203 09:04:28.154939 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:04:28 crc kubenswrapper[4831]: I1203 09:04:28.998086 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" exitCode=0 Dec 03 09:04:28 crc kubenswrapper[4831]: I1203 09:04:28.998171 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2"} Dec 03 09:04:28 crc kubenswrapper[4831]: I1203 09:04:28.998571 4831 scope.go:117] "RemoveContainer" containerID="0cb79465c4851d9ccab0a172ec47f2783944898da9d4aedbce6d8b6c849dbeb6" Dec 03 09:04:28 crc kubenswrapper[4831]: I1203 09:04:28.999425 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:04:28 crc kubenswrapper[4831]: E1203 09:04:28.999837 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:04:43 crc kubenswrapper[4831]: I1203 09:04:43.026689 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:04:43 crc kubenswrapper[4831]: E1203 09:04:43.027723 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:04:58 crc kubenswrapper[4831]: I1203 09:04:58.013351 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:04:58 crc kubenswrapper[4831]: E1203 09:04:58.014366 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:05:11 crc kubenswrapper[4831]: I1203 09:05:11.013357 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:05:11 crc kubenswrapper[4831]: E1203 09:05:11.014050 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:05:25 crc kubenswrapper[4831]: I1203 09:05:25.013338 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:05:25 crc kubenswrapper[4831]: E1203 09:05:25.014273 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:05:37 crc kubenswrapper[4831]: I1203 09:05:37.013594 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:05:37 crc kubenswrapper[4831]: E1203 09:05:37.014412 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:05:50 crc kubenswrapper[4831]: I1203 09:05:50.013106 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:05:50 crc kubenswrapper[4831]: E1203 09:05:50.014186 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.201067 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-mnvp8"] Dec 03 09:06:00 crc kubenswrapper[4831]: E1203 09:06:00.202107 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="extract-utilities" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.202122 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="extract-utilities" Dec 03 09:06:00 crc kubenswrapper[4831]: E1203 09:06:00.202138 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="extract-content" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.202144 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="extract-content" Dec 03 09:06:00 crc kubenswrapper[4831]: E1203 09:06:00.202160 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.202170 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 03 09:06:00 crc kubenswrapper[4831]: E1203 09:06:00.202192 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="registry-server" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.202197 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="registry-server" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.202434 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d" containerName="nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.202521 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e9f20e9-0fef-4514-98b7-95513a86ff1c" containerName="registry-server" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.204276 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.234990 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnvp8"] Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.290584 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-utilities\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.290713 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r987\" (UniqueName: \"kubernetes.io/projected/f0913e04-f037-42e1-9ec2-38926d31853c-kube-api-access-9r987\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.290754 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-catalog-content\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.392434 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-utilities\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.392547 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r987\" (UniqueName: \"kubernetes.io/projected/f0913e04-f037-42e1-9ec2-38926d31853c-kube-api-access-9r987\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.392585 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-catalog-content\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.393214 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-catalog-content\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.393231 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-utilities\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.419599 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r987\" (UniqueName: \"kubernetes.io/projected/f0913e04-f037-42e1-9ec2-38926d31853c-kube-api-access-9r987\") pod \"redhat-marketplace-mnvp8\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:00 crc kubenswrapper[4831]: I1203 09:06:00.539626 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:01 crc kubenswrapper[4831]: I1203 09:06:01.062060 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnvp8"] Dec 03 09:06:01 crc kubenswrapper[4831]: I1203 09:06:01.079084 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnvp8" event={"ID":"f0913e04-f037-42e1-9ec2-38926d31853c","Type":"ContainerStarted","Data":"edb8c0a1857d5dc5fdfd4a057493cacd119d96ccda7a83acf7a3ed52f0184f5b"} Dec 03 09:06:02 crc kubenswrapper[4831]: I1203 09:06:02.098911 4831 generic.go:334] "Generic (PLEG): container finished" podID="f0913e04-f037-42e1-9ec2-38926d31853c" containerID="87c8b3404e509be6e975f32faf9d03022a615deec0c942c0203f7499fc97854d" exitCode=0 Dec 03 09:06:02 crc kubenswrapper[4831]: I1203 09:06:02.099254 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnvp8" event={"ID":"f0913e04-f037-42e1-9ec2-38926d31853c","Type":"ContainerDied","Data":"87c8b3404e509be6e975f32faf9d03022a615deec0c942c0203f7499fc97854d"} Dec 03 09:06:04 crc kubenswrapper[4831]: I1203 09:06:04.122212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-mnvp8" event={"ID":"f0913e04-f037-42e1-9ec2-38926d31853c","Type":"ContainerStarted","Data":"f82bfbe36fb40e04b32bc018b374dc5fa4ee6cdc5ca8f920da9fdf03aa72a367"} Dec 03 09:06:04 crc kubenswrapper[4831]: I1203 09:06:04.977388 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-gkzxn"] Dec 03 09:06:04 crc kubenswrapper[4831]: I1203 09:06:04.983766 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:04 crc kubenswrapper[4831]: I1203 09:06:04.994375 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkzxn"] Dec 03 09:06:04 crc kubenswrapper[4831]: I1203 09:06:04.999272 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-catalog-content\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:04 crc kubenswrapper[4831]: I1203 09:06:04.999468 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-utilities\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:04 crc kubenswrapper[4831]: I1203 09:06:04.999519 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfqhm\" (UniqueName: \"kubernetes.io/projected/dffdf963-3509-4ebd-aa16-b701a9ac2086-kube-api-access-tfqhm\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 
09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.017960 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:06:05 crc kubenswrapper[4831]: E1203 09:06:05.018762 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.101778 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-utilities\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.102535 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfqhm\" (UniqueName: \"kubernetes.io/projected/dffdf963-3509-4ebd-aa16-b701a9ac2086-kube-api-access-tfqhm\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.102544 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-utilities\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.103068 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-catalog-content\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.103520 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-catalog-content\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.136499 4831 generic.go:334] "Generic (PLEG): container finished" podID="f0913e04-f037-42e1-9ec2-38926d31853c" containerID="f82bfbe36fb40e04b32bc018b374dc5fa4ee6cdc5ca8f920da9fdf03aa72a367" exitCode=0 Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.136535 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnvp8" event={"ID":"f0913e04-f037-42e1-9ec2-38926d31853c","Type":"ContainerDied","Data":"f82bfbe36fb40e04b32bc018b374dc5fa4ee6cdc5ca8f920da9fdf03aa72a367"} Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.820987 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfqhm\" (UniqueName: \"kubernetes.io/projected/dffdf963-3509-4ebd-aa16-b701a9ac2086-kube-api-access-tfqhm\") pod \"certified-operators-gkzxn\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:05 crc kubenswrapper[4831]: I1203 09:06:05.906688 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:06 crc kubenswrapper[4831]: I1203 09:06:06.172403 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnvp8" event={"ID":"f0913e04-f037-42e1-9ec2-38926d31853c","Type":"ContainerStarted","Data":"2aedae8f9fd45a0213d7ed83a60586c48e44e5d7bdb174b2ea0532e373639c38"} Dec 03 09:06:06 crc kubenswrapper[4831]: I1203 09:06:06.211136 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-mnvp8" podStartSLOduration=2.526160681 podStartE2EDuration="6.211113834s" podCreationTimestamp="2025-12-03 09:06:00 +0000 UTC" firstStartedPulling="2025-12-03 09:06:02.102246605 +0000 UTC m=+9299.445830113" lastFinishedPulling="2025-12-03 09:06:05.787199758 +0000 UTC m=+9303.130783266" observedRunningTime="2025-12-03 09:06:06.200800404 +0000 UTC m=+9303.544383912" watchObservedRunningTime="2025-12-03 09:06:06.211113834 +0000 UTC m=+9303.554697342" Dec 03 09:06:06 crc kubenswrapper[4831]: I1203 09:06:06.441973 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-gkzxn"] Dec 03 09:06:06 crc kubenswrapper[4831]: W1203 09:06:06.448823 4831 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddffdf963_3509_4ebd_aa16_b701a9ac2086.slice/crio-992cdd5f80338f44cde3b2c501fdd58d161ef1f904094e9d6c8808203a15aef4 WatchSource:0}: Error finding container 992cdd5f80338f44cde3b2c501fdd58d161ef1f904094e9d6c8808203a15aef4: Status 404 returned error can't find the container with id 992cdd5f80338f44cde3b2c501fdd58d161ef1f904094e9d6c8808203a15aef4 Dec 03 09:06:07 crc kubenswrapper[4831]: I1203 09:06:07.184096 4831 generic.go:334] "Generic (PLEG): container finished" podID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerID="2219ce6d5801081b2891befc7737c51de81d0d9b33093e6a8922d74e771daea9" 
exitCode=0 Dec 03 09:06:07 crc kubenswrapper[4831]: I1203 09:06:07.184154 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzxn" event={"ID":"dffdf963-3509-4ebd-aa16-b701a9ac2086","Type":"ContainerDied","Data":"2219ce6d5801081b2891befc7737c51de81d0d9b33093e6a8922d74e771daea9"} Dec 03 09:06:07 crc kubenswrapper[4831]: I1203 09:06:07.185443 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzxn" event={"ID":"dffdf963-3509-4ebd-aa16-b701a9ac2086","Type":"ContainerStarted","Data":"992cdd5f80338f44cde3b2c501fdd58d161ef1f904094e9d6c8808203a15aef4"} Dec 03 09:06:09 crc kubenswrapper[4831]: I1203 09:06:09.211688 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzxn" event={"ID":"dffdf963-3509-4ebd-aa16-b701a9ac2086","Type":"ContainerStarted","Data":"07a070aa973e6185ee384388985435b453259a614c32a95fd155059bcb11c9a4"} Dec 03 09:06:10 crc kubenswrapper[4831]: I1203 09:06:10.223390 4831 generic.go:334] "Generic (PLEG): container finished" podID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerID="07a070aa973e6185ee384388985435b453259a614c32a95fd155059bcb11c9a4" exitCode=0 Dec 03 09:06:10 crc kubenswrapper[4831]: I1203 09:06:10.223453 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzxn" event={"ID":"dffdf963-3509-4ebd-aa16-b701a9ac2086","Type":"ContainerDied","Data":"07a070aa973e6185ee384388985435b453259a614c32a95fd155059bcb11c9a4"} Dec 03 09:06:10 crc kubenswrapper[4831]: I1203 09:06:10.543461 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:10 crc kubenswrapper[4831]: I1203 09:06:10.543513 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:10 crc kubenswrapper[4831]: I1203 09:06:10.597275 
4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:11 crc kubenswrapper[4831]: I1203 09:06:11.238488 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzxn" event={"ID":"dffdf963-3509-4ebd-aa16-b701a9ac2086","Type":"ContainerStarted","Data":"1be466a454b9361b9495bd475a4cd0d9f3a83503afd3ee3639edfb55501f6333"} Dec 03 09:06:11 crc kubenswrapper[4831]: I1203 09:06:11.281496 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-gkzxn" podStartSLOduration=3.662035633 podStartE2EDuration="7.281475107s" podCreationTimestamp="2025-12-03 09:06:04 +0000 UTC" firstStartedPulling="2025-12-03 09:06:07.186502569 +0000 UTC m=+9304.530086087" lastFinishedPulling="2025-12-03 09:06:10.805942013 +0000 UTC m=+9308.149525561" observedRunningTime="2025-12-03 09:06:11.270997671 +0000 UTC m=+9308.614581219" watchObservedRunningTime="2025-12-03 09:06:11.281475107 +0000 UTC m=+9308.625058615" Dec 03 09:06:11 crc kubenswrapper[4831]: I1203 09:06:11.318340 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:12 crc kubenswrapper[4831]: I1203 09:06:12.986355 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnvp8"] Dec 03 09:06:13 crc kubenswrapper[4831]: I1203 09:06:13.258068 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-mnvp8" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" containerName="registry-server" containerID="cri-o://2aedae8f9fd45a0213d7ed83a60586c48e44e5d7bdb174b2ea0532e373639c38" gracePeriod=2 Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.297625 4831 generic.go:334] "Generic (PLEG): container finished" podID="f0913e04-f037-42e1-9ec2-38926d31853c" 
containerID="2aedae8f9fd45a0213d7ed83a60586c48e44e5d7bdb174b2ea0532e373639c38" exitCode=0 Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.297695 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-mnvp8" event={"ID":"f0913e04-f037-42e1-9ec2-38926d31853c","Type":"ContainerDied","Data":"2aedae8f9fd45a0213d7ed83a60586c48e44e5d7bdb174b2ea0532e373639c38"} Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.443666 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.526392 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r987\" (UniqueName: \"kubernetes.io/projected/f0913e04-f037-42e1-9ec2-38926d31853c-kube-api-access-9r987\") pod \"f0913e04-f037-42e1-9ec2-38926d31853c\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.526808 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-utilities\") pod \"f0913e04-f037-42e1-9ec2-38926d31853c\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.527117 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-catalog-content\") pod \"f0913e04-f037-42e1-9ec2-38926d31853c\" (UID: \"f0913e04-f037-42e1-9ec2-38926d31853c\") " Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.529183 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-utilities" (OuterVolumeSpecName: "utilities") pod "f0913e04-f037-42e1-9ec2-38926d31853c" (UID: 
"f0913e04-f037-42e1-9ec2-38926d31853c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.535550 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f0913e04-f037-42e1-9ec2-38926d31853c-kube-api-access-9r987" (OuterVolumeSpecName: "kube-api-access-9r987") pod "f0913e04-f037-42e1-9ec2-38926d31853c" (UID: "f0913e04-f037-42e1-9ec2-38926d31853c"). InnerVolumeSpecName "kube-api-access-9r987". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.554413 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f0913e04-f037-42e1-9ec2-38926d31853c" (UID: "f0913e04-f037-42e1-9ec2-38926d31853c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.630088 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.630141 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r987\" (UniqueName: \"kubernetes.io/projected/f0913e04-f037-42e1-9ec2-38926d31853c-kube-api-access-9r987\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:14 crc kubenswrapper[4831]: I1203 09:06:14.630173 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f0913e04-f037-42e1-9ec2-38926d31853c-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.319968 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-mnvp8" event={"ID":"f0913e04-f037-42e1-9ec2-38926d31853c","Type":"ContainerDied","Data":"edb8c0a1857d5dc5fdfd4a057493cacd119d96ccda7a83acf7a3ed52f0184f5b"} Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.320063 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-mnvp8" Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.321168 4831 scope.go:117] "RemoveContainer" containerID="2aedae8f9fd45a0213d7ed83a60586c48e44e5d7bdb174b2ea0532e373639c38" Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.349667 4831 scope.go:117] "RemoveContainer" containerID="f82bfbe36fb40e04b32bc018b374dc5fa4ee6cdc5ca8f920da9fdf03aa72a367" Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.350373 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnvp8"] Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.361780 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-mnvp8"] Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.372258 4831 scope.go:117] "RemoveContainer" containerID="87c8b3404e509be6e975f32faf9d03022a615deec0c942c0203f7499fc97854d" Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.906806 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:15 crc kubenswrapper[4831]: I1203 09:06:15.907101 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.231665 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.231875 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/mariadb-copy-data" 
podUID="18e2c38d-2593-41b7-a74a-74fd5671f27d" containerName="adoption" containerID="cri-o://e1026a1036699d43cf788f6d01284c6257c26b6806cfb3a948b3e5cac425bee1" gracePeriod=30 Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.465086 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.519839 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.781812 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wq55h"] Dec 03 09:06:16 crc kubenswrapper[4831]: E1203 09:06:16.782285 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" containerName="registry-server" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.782304 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" containerName="registry-server" Dec 03 09:06:16 crc kubenswrapper[4831]: E1203 09:06:16.782797 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" containerName="extract-utilities" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.782811 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" containerName="extract-utilities" Dec 03 09:06:16 crc kubenswrapper[4831]: E1203 09:06:16.782845 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" containerName="extract-content" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.782861 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" containerName="extract-content" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.783137 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" containerName="registry-server" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.785869 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.817502 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq55h"] Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.885695 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-utilities\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.886061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-catalog-content\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.886998 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8slb\" (UniqueName: \"kubernetes.io/projected/5ab25bea-c97e-49b5-9660-2328f2eb9f00-kube-api-access-d8slb\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.989250 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8slb\" (UniqueName: 
\"kubernetes.io/projected/5ab25bea-c97e-49b5-9660-2328f2eb9f00-kube-api-access-d8slb\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.989655 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-utilities\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.989772 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-catalog-content\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.990156 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-utilities\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:16 crc kubenswrapper[4831]: I1203 09:06:16.990268 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-catalog-content\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:17 crc kubenswrapper[4831]: I1203 09:06:17.009161 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8slb\" (UniqueName: 
\"kubernetes.io/projected/5ab25bea-c97e-49b5-9660-2328f2eb9f00-kube-api-access-d8slb\") pod \"community-operators-wq55h\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:17 crc kubenswrapper[4831]: I1203 09:06:17.013391 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:06:17 crc kubenswrapper[4831]: E1203 09:06:17.013642 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:06:17 crc kubenswrapper[4831]: I1203 09:06:17.027800 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f0913e04-f037-42e1-9ec2-38926d31853c" path="/var/lib/kubelet/pods/f0913e04-f037-42e1-9ec2-38926d31853c/volumes" Dec 03 09:06:17 crc kubenswrapper[4831]: I1203 09:06:17.106200 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:17 crc kubenswrapper[4831]: I1203 09:06:17.621196 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wq55h"] Dec 03 09:06:18 crc kubenswrapper[4831]: I1203 09:06:18.369982 4831 generic.go:334] "Generic (PLEG): container finished" podID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerID="23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755" exitCode=0 Dec 03 09:06:18 crc kubenswrapper[4831]: I1203 09:06:18.370059 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq55h" event={"ID":"5ab25bea-c97e-49b5-9660-2328f2eb9f00","Type":"ContainerDied","Data":"23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755"} Dec 03 09:06:18 crc kubenswrapper[4831]: I1203 09:06:18.370308 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq55h" event={"ID":"5ab25bea-c97e-49b5-9660-2328f2eb9f00","Type":"ContainerStarted","Data":"d58fafaa7c0a23c842efc53ba9540c3085edd510dad0b5b00e0a8e5c0301b8ca"} Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.171168 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkzxn"] Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.171854 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-gkzxn" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerName="registry-server" containerID="cri-o://1be466a454b9361b9495bd475a4cd0d9f3a83503afd3ee3639edfb55501f6333" gracePeriod=2 Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.388616 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq55h" 
event={"ID":"5ab25bea-c97e-49b5-9660-2328f2eb9f00","Type":"ContainerStarted","Data":"98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab"} Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.403203 4831 generic.go:334] "Generic (PLEG): container finished" podID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerID="1be466a454b9361b9495bd475a4cd0d9f3a83503afd3ee3639edfb55501f6333" exitCode=0 Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.403252 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzxn" event={"ID":"dffdf963-3509-4ebd-aa16-b701a9ac2086","Type":"ContainerDied","Data":"1be466a454b9361b9495bd475a4cd0d9f3a83503afd3ee3639edfb55501f6333"} Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.703726 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.853072 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-utilities\") pod \"dffdf963-3509-4ebd-aa16-b701a9ac2086\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.853217 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tfqhm\" (UniqueName: \"kubernetes.io/projected/dffdf963-3509-4ebd-aa16-b701a9ac2086-kube-api-access-tfqhm\") pod \"dffdf963-3509-4ebd-aa16-b701a9ac2086\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.853402 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-catalog-content\") pod \"dffdf963-3509-4ebd-aa16-b701a9ac2086\" (UID: \"dffdf963-3509-4ebd-aa16-b701a9ac2086\") " Dec 
03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.854173 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-utilities" (OuterVolumeSpecName: "utilities") pod "dffdf963-3509-4ebd-aa16-b701a9ac2086" (UID: "dffdf963-3509-4ebd-aa16-b701a9ac2086"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.862671 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dffdf963-3509-4ebd-aa16-b701a9ac2086-kube-api-access-tfqhm" (OuterVolumeSpecName: "kube-api-access-tfqhm") pod "dffdf963-3509-4ebd-aa16-b701a9ac2086" (UID: "dffdf963-3509-4ebd-aa16-b701a9ac2086"). InnerVolumeSpecName "kube-api-access-tfqhm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.914739 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "dffdf963-3509-4ebd-aa16-b701a9ac2086" (UID: "dffdf963-3509-4ebd-aa16-b701a9ac2086"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.955958 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.955990 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dffdf963-3509-4ebd-aa16-b701a9ac2086-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:19 crc kubenswrapper[4831]: I1203 09:06:19.956001 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tfqhm\" (UniqueName: \"kubernetes.io/projected/dffdf963-3509-4ebd-aa16-b701a9ac2086-kube-api-access-tfqhm\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.419682 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-gkzxn" event={"ID":"dffdf963-3509-4ebd-aa16-b701a9ac2086","Type":"ContainerDied","Data":"992cdd5f80338f44cde3b2c501fdd58d161ef1f904094e9d6c8808203a15aef4"} Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.419748 4831 scope.go:117] "RemoveContainer" containerID="1be466a454b9361b9495bd475a4cd0d9f3a83503afd3ee3639edfb55501f6333" Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.419960 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-gkzxn" Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.422913 4831 generic.go:334] "Generic (PLEG): container finished" podID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerID="98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab" exitCode=0 Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.423212 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq55h" event={"ID":"5ab25bea-c97e-49b5-9660-2328f2eb9f00","Type":"ContainerDied","Data":"98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab"} Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.453988 4831 scope.go:117] "RemoveContainer" containerID="07a070aa973e6185ee384388985435b453259a614c32a95fd155059bcb11c9a4" Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.478620 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-gkzxn"] Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.490226 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-gkzxn"] Dec 03 09:06:20 crc kubenswrapper[4831]: I1203 09:06:20.491035 4831 scope.go:117] "RemoveContainer" containerID="2219ce6d5801081b2891befc7737c51de81d0d9b33093e6a8922d74e771daea9" Dec 03 09:06:21 crc kubenswrapper[4831]: I1203 09:06:21.026525 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" path="/var/lib/kubelet/pods/dffdf963-3509-4ebd-aa16-b701a9ac2086/volumes" Dec 03 09:06:22 crc kubenswrapper[4831]: I1203 09:06:22.449578 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq55h" event={"ID":"5ab25bea-c97e-49b5-9660-2328f2eb9f00","Type":"ContainerStarted","Data":"5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b"} Dec 03 09:06:22 crc kubenswrapper[4831]: I1203 09:06:22.469275 
4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wq55h" podStartSLOduration=3.206098082 podStartE2EDuration="6.469251465s" podCreationTimestamp="2025-12-03 09:06:16 +0000 UTC" firstStartedPulling="2025-12-03 09:06:18.372626176 +0000 UTC m=+9315.716209674" lastFinishedPulling="2025-12-03 09:06:21.635779549 +0000 UTC m=+9318.979363057" observedRunningTime="2025-12-03 09:06:22.463796585 +0000 UTC m=+9319.807380083" watchObservedRunningTime="2025-12-03 09:06:22.469251465 +0000 UTC m=+9319.812834973" Dec 03 09:06:27 crc kubenswrapper[4831]: I1203 09:06:27.106647 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:27 crc kubenswrapper[4831]: I1203 09:06:27.107497 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:28 crc kubenswrapper[4831]: I1203 09:06:28.173062 4831 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-wq55h" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="registry-server" probeResult="failure" output=< Dec 03 09:06:28 crc kubenswrapper[4831]: timeout: failed to connect service ":50051" within 1s Dec 03 09:06:28 crc kubenswrapper[4831]: > Dec 03 09:06:32 crc kubenswrapper[4831]: I1203 09:06:32.012864 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:06:32 crc kubenswrapper[4831]: E1203 09:06:32.013593 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" 
podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:06:37 crc kubenswrapper[4831]: I1203 09:06:37.180195 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:37 crc kubenswrapper[4831]: I1203 09:06:37.237817 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:37 crc kubenswrapper[4831]: I1203 09:06:37.420174 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wq55h"] Dec 03 09:06:38 crc kubenswrapper[4831]: I1203 09:06:38.683185 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wq55h" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="registry-server" containerID="cri-o://5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b" gracePeriod=2 Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.363954 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.443227 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-catalog-content\") pod \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.443482 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-utilities\") pod \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.443760 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8slb\" (UniqueName: \"kubernetes.io/projected/5ab25bea-c97e-49b5-9660-2328f2eb9f00-kube-api-access-d8slb\") pod \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\" (UID: \"5ab25bea-c97e-49b5-9660-2328f2eb9f00\") " Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.444505 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-utilities" (OuterVolumeSpecName: "utilities") pod "5ab25bea-c97e-49b5-9660-2328f2eb9f00" (UID: "5ab25bea-c97e-49b5-9660-2328f2eb9f00"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.444804 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.449392 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ab25bea-c97e-49b5-9660-2328f2eb9f00-kube-api-access-d8slb" (OuterVolumeSpecName: "kube-api-access-d8slb") pod "5ab25bea-c97e-49b5-9660-2328f2eb9f00" (UID: "5ab25bea-c97e-49b5-9660-2328f2eb9f00"). InnerVolumeSpecName "kube-api-access-d8slb". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.499365 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5ab25bea-c97e-49b5-9660-2328f2eb9f00" (UID: "5ab25bea-c97e-49b5-9660-2328f2eb9f00"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.546540 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5ab25bea-c97e-49b5-9660-2328f2eb9f00-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.546571 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8slb\" (UniqueName: \"kubernetes.io/projected/5ab25bea-c97e-49b5-9660-2328f2eb9f00-kube-api-access-d8slb\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.694783 4831 generic.go:334] "Generic (PLEG): container finished" podID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerID="5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b" exitCode=0 Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.694838 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wq55h" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.694843 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq55h" event={"ID":"5ab25bea-c97e-49b5-9660-2328f2eb9f00","Type":"ContainerDied","Data":"5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b"} Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.696358 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wq55h" event={"ID":"5ab25bea-c97e-49b5-9660-2328f2eb9f00","Type":"ContainerDied","Data":"d58fafaa7c0a23c842efc53ba9540c3085edd510dad0b5b00e0a8e5c0301b8ca"} Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.696383 4831 scope.go:117] "RemoveContainer" containerID="5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.721053 4831 scope.go:117] "RemoveContainer" 
containerID="98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.753171 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wq55h"] Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.754463 4831 scope.go:117] "RemoveContainer" containerID="23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.764191 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wq55h"] Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.815811 4831 scope.go:117] "RemoveContainer" containerID="5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b" Dec 03 09:06:39 crc kubenswrapper[4831]: E1203 09:06:39.816342 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b\": container with ID starting with 5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b not found: ID does not exist" containerID="5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.816380 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b"} err="failed to get container status \"5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b\": rpc error: code = NotFound desc = could not find container \"5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b\": container with ID starting with 5cdcb1e125bf0070f60db261e39539ff37d1bd7426d413382c5cfc197617c84b not found: ID does not exist" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.816401 4831 scope.go:117] "RemoveContainer" 
containerID="98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab" Dec 03 09:06:39 crc kubenswrapper[4831]: E1203 09:06:39.816748 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab\": container with ID starting with 98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab not found: ID does not exist" containerID="98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.816795 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab"} err="failed to get container status \"98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab\": rpc error: code = NotFound desc = could not find container \"98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab\": container with ID starting with 98b3a3f4c75e151e0123cee5f039aeb78cf204942049435e32adf827551fafab not found: ID does not exist" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.816828 4831 scope.go:117] "RemoveContainer" containerID="23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755" Dec 03 09:06:39 crc kubenswrapper[4831]: E1203 09:06:39.817694 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755\": container with ID starting with 23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755 not found: ID does not exist" containerID="23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755" Dec 03 09:06:39 crc kubenswrapper[4831]: I1203 09:06:39.817713 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755"} err="failed to get container status \"23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755\": rpc error: code = NotFound desc = could not find container \"23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755\": container with ID starting with 23f9b0ba7af2dca77140f70853d7cf749f4ce1a71a4360e63cc9a88b421bd755 not found: ID does not exist" Dec 03 09:06:41 crc kubenswrapper[4831]: I1203 09:06:41.027896 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" path="/var/lib/kubelet/pods/5ab25bea-c97e-49b5-9660-2328f2eb9f00/volumes" Dec 03 09:06:44 crc kubenswrapper[4831]: I1203 09:06:44.013668 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:06:44 crc kubenswrapper[4831]: E1203 09:06:44.014655 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:06:46 crc kubenswrapper[4831]: I1203 09:06:46.788516 4831 generic.go:334] "Generic (PLEG): container finished" podID="18e2c38d-2593-41b7-a74a-74fd5671f27d" containerID="e1026a1036699d43cf788f6d01284c6257c26b6806cfb3a948b3e5cac425bee1" exitCode=137 Dec 03 09:06:46 crc kubenswrapper[4831]: I1203 09:06:46.788607 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"18e2c38d-2593-41b7-a74a-74fd5671f27d","Type":"ContainerDied","Data":"e1026a1036699d43cf788f6d01284c6257c26b6806cfb3a948b3e5cac425bee1"} Dec 03 09:06:46 crc kubenswrapper[4831]: I1203 
09:06:46.967912 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.015129 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mariadb-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de\") pod \"18e2c38d-2593-41b7-a74a-74fd5671f27d\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") " Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.015205 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f7xml\" (UniqueName: \"kubernetes.io/projected/18e2c38d-2593-41b7-a74a-74fd5671f27d-kube-api-access-f7xml\") pod \"18e2c38d-2593-41b7-a74a-74fd5671f27d\" (UID: \"18e2c38d-2593-41b7-a74a-74fd5671f27d\") " Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.022071 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18e2c38d-2593-41b7-a74a-74fd5671f27d-kube-api-access-f7xml" (OuterVolumeSpecName: "kube-api-access-f7xml") pod "18e2c38d-2593-41b7-a74a-74fd5671f27d" (UID: "18e2c38d-2593-41b7-a74a-74fd5671f27d"). InnerVolumeSpecName "kube-api-access-f7xml". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.088403 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de" (OuterVolumeSpecName: "mariadb-data") pod "18e2c38d-2593-41b7-a74a-74fd5671f27d" (UID: "18e2c38d-2593-41b7-a74a-74fd5671f27d"). InnerVolumeSpecName "pvc-39da139f-5ed3-4973-89fc-b011508a45de". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.118574 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-39da139f-5ed3-4973-89fc-b011508a45de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de\") on node \"crc\" " Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.118624 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f7xml\" (UniqueName: \"kubernetes.io/projected/18e2c38d-2593-41b7-a74a-74fd5671f27d-kube-api-access-f7xml\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.208998 4831 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.209195 4831 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-39da139f-5ed3-4973-89fc-b011508a45de" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de") on node "crc" Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.221413 4831 reconciler_common.go:293] "Volume detached for volume \"pvc-39da139f-5ed3-4973-89fc-b011508a45de\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-39da139f-5ed3-4973-89fc-b011508a45de\") on node \"crc\" DevicePath \"\"" Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.800575 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/mariadb-copy-data" event={"ID":"18e2c38d-2593-41b7-a74a-74fd5671f27d","Type":"ContainerDied","Data":"cb7060465747354d2071d28bf5973e06448c8266d3daf9aa7801a5461856fed9"} Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.800838 4831 scope.go:117] "RemoveContainer" containerID="e1026a1036699d43cf788f6d01284c6257c26b6806cfb3a948b3e5cac425bee1" Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.800836 
4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/mariadb-copy-data" Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.851205 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 09:06:47 crc kubenswrapper[4831]: I1203 09:06:47.868282 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/mariadb-copy-data"] Dec 03 09:06:48 crc kubenswrapper[4831]: I1203 09:06:48.465610 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 09:06:48 crc kubenswrapper[4831]: I1203 09:06:48.466480 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ovn-copy-data" podUID="725c995a-d355-4a06-9824-518cea6948e5" containerName="adoption" containerID="cri-o://ec5851ca5e1181840c16cabb90be184bd203eb416b53f6c720f8c14986f6e1bc" gracePeriod=30 Dec 03 09:06:49 crc kubenswrapper[4831]: I1203 09:06:49.030080 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="18e2c38d-2593-41b7-a74a-74fd5671f27d" path="/var/lib/kubelet/pods/18e2c38d-2593-41b7-a74a-74fd5671f27d/volumes" Dec 03 09:06:55 crc kubenswrapper[4831]: I1203 09:06:55.013711 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:06:55 crc kubenswrapper[4831]: E1203 09:06:55.016036 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:07:10 crc kubenswrapper[4831]: I1203 09:07:10.013341 4831 scope.go:117] "RemoveContainer" 
containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:07:10 crc kubenswrapper[4831]: E1203 09:07:10.014118 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.212380 4831 generic.go:334] "Generic (PLEG): container finished" podID="725c995a-d355-4a06-9824-518cea6948e5" containerID="ec5851ca5e1181840c16cabb90be184bd203eb416b53f6c720f8c14986f6e1bc" exitCode=137 Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.212443 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"725c995a-d355-4a06-9824-518cea6948e5","Type":"ContainerDied","Data":"ec5851ca5e1181840c16cabb90be184bd203eb416b53f6c720f8c14986f6e1bc"} Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.517567 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.706571 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\") pod \"725c995a-d355-4a06-9824-518cea6948e5\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.706745 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zm7fp\" (UniqueName: \"kubernetes.io/projected/725c995a-d355-4a06-9824-518cea6948e5-kube-api-access-zm7fp\") pod \"725c995a-d355-4a06-9824-518cea6948e5\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.706782 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/725c995a-d355-4a06-9824-518cea6948e5-ovn-data-cert\") pod \"725c995a-d355-4a06-9824-518cea6948e5\" (UID: \"725c995a-d355-4a06-9824-518cea6948e5\") " Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.714030 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/725c995a-d355-4a06-9824-518cea6948e5-kube-api-access-zm7fp" (OuterVolumeSpecName: "kube-api-access-zm7fp") pod "725c995a-d355-4a06-9824-518cea6948e5" (UID: "725c995a-d355-4a06-9824-518cea6948e5"). InnerVolumeSpecName "kube-api-access-zm7fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.714580 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/725c995a-d355-4a06-9824-518cea6948e5-ovn-data-cert" (OuterVolumeSpecName: "ovn-data-cert") pod "725c995a-d355-4a06-9824-518cea6948e5" (UID: "725c995a-d355-4a06-9824-518cea6948e5"). InnerVolumeSpecName "ovn-data-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.731851 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25" (OuterVolumeSpecName: "ovn-data") pod "725c995a-d355-4a06-9824-518cea6948e5" (UID: "725c995a-d355-4a06-9824-518cea6948e5"). InnerVolumeSpecName "pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25". PluginName "kubernetes.io/csi", VolumeGidValue "" Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.811757 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zm7fp\" (UniqueName: \"kubernetes.io/projected/725c995a-d355-4a06-9824-518cea6948e5-kube-api-access-zm7fp\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.811804 4831 reconciler_common.go:293] "Volume detached for volume \"ovn-data-cert\" (UniqueName: \"kubernetes.io/secret/725c995a-d355-4a06-9824-518cea6948e5-ovn-data-cert\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.811855 4831 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\") on node \"crc\" " Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.853869 4831 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.854038 4831 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25") on node "crc" Dec 03 09:07:19 crc kubenswrapper[4831]: I1203 09:07:19.913634 4831 reconciler_common.go:293] "Volume detached for volume \"pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-d0d5a2b0-f55a-4115-8406-c9174547ec25\") on node \"crc\" DevicePath \"\"" Dec 03 09:07:20 crc kubenswrapper[4831]: I1203 09:07:20.230463 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-copy-data" event={"ID":"725c995a-d355-4a06-9824-518cea6948e5","Type":"ContainerDied","Data":"5df60974a8b13c23c66a903976e3a04a033baa85a6f77e007a314c5d86d39067"} Dec 03 09:07:20 crc kubenswrapper[4831]: I1203 09:07:20.230543 4831 scope.go:117] "RemoveContainer" containerID="ec5851ca5e1181840c16cabb90be184bd203eb416b53f6c720f8c14986f6e1bc" Dec 03 09:07:20 crc kubenswrapper[4831]: I1203 09:07:20.230813 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-copy-data" Dec 03 09:07:20 crc kubenswrapper[4831]: I1203 09:07:20.293464 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 09:07:20 crc kubenswrapper[4831]: I1203 09:07:20.303898 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-copy-data"] Dec 03 09:07:21 crc kubenswrapper[4831]: I1203 09:07:21.025717 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="725c995a-d355-4a06-9824-518cea6948e5" path="/var/lib/kubelet/pods/725c995a-d355-4a06-9824-518cea6948e5/volumes" Dec 03 09:07:22 crc kubenswrapper[4831]: I1203 09:07:22.012699 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:07:22 crc kubenswrapper[4831]: E1203 09:07:22.014394 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:07:33 crc kubenswrapper[4831]: I1203 09:07:33.022450 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:07:33 crc kubenswrapper[4831]: E1203 09:07:33.023225 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:07:45 crc kubenswrapper[4831]: I1203 
09:07:45.013550 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:07:45 crc kubenswrapper[4831]: E1203 09:07:45.015980 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:07:57 crc kubenswrapper[4831]: I1203 09:07:57.013416 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:07:57 crc kubenswrapper[4831]: E1203 09:07:57.014284 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:08:12 crc kubenswrapper[4831]: I1203 09:08:12.013140 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:08:12 crc kubenswrapper[4831]: E1203 09:08:12.013959 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:08:17 crc 
kubenswrapper[4831]: I1203 09:08:17.904704 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cqxbg/must-gather-wpzz9"] Dec 03 09:08:17 crc kubenswrapper[4831]: E1203 09:08:17.905838 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="extract-utilities" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.905857 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="extract-utilities" Dec 03 09:08:17 crc kubenswrapper[4831]: E1203 09:08:17.905882 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18e2c38d-2593-41b7-a74a-74fd5671f27d" containerName="adoption" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.905890 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="18e2c38d-2593-41b7-a74a-74fd5671f27d" containerName="adoption" Dec 03 09:08:17 crc kubenswrapper[4831]: E1203 09:08:17.905909 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerName="registry-server" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.905917 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerName="registry-server" Dec 03 09:08:17 crc kubenswrapper[4831]: E1203 09:08:17.905935 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="725c995a-d355-4a06-9824-518cea6948e5" containerName="adoption" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.905942 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="725c995a-d355-4a06-9824-518cea6948e5" containerName="adoption" Dec 03 09:08:17 crc kubenswrapper[4831]: E1203 09:08:17.905960 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerName="extract-utilities" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.905968 4831 
state_mem.go:107] "Deleted CPUSet assignment" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerName="extract-utilities" Dec 03 09:08:17 crc kubenswrapper[4831]: E1203 09:08:17.905998 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerName="extract-content" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.906004 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerName="extract-content" Dec 03 09:08:17 crc kubenswrapper[4831]: E1203 09:08:17.906015 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="registry-server" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.906022 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="registry-server" Dec 03 09:08:17 crc kubenswrapper[4831]: E1203 09:08:17.906031 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="extract-content" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.906041 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="extract-content" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.906303 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="dffdf963-3509-4ebd-aa16-b701a9ac2086" containerName="registry-server" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.906352 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="725c995a-d355-4a06-9824-518cea6948e5" containerName="adoption" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.906382 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ab25bea-c97e-49b5-9660-2328f2eb9f00" containerName="registry-server" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.906400 4831 
memory_manager.go:354] "RemoveStaleState removing state" podUID="18e2c38d-2593-41b7-a74a-74fd5671f27d" containerName="adoption" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.908008 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxbg/must-gather-wpzz9" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.910193 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-cqxbg"/"default-dockercfg-g7888" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.910681 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cqxbg"/"kube-root-ca.crt" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.913554 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-cqxbg"/"openshift-service-ca.crt" Dec 03 09:08:17 crc kubenswrapper[4831]: I1203 09:08:17.915626 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cqxbg/must-gather-wpzz9"] Dec 03 09:08:18 crc kubenswrapper[4831]: I1203 09:08:18.014274 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mxv5\" (UniqueName: \"kubernetes.io/projected/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-kube-api-access-7mxv5\") pod \"must-gather-wpzz9\" (UID: \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\") " pod="openshift-must-gather-cqxbg/must-gather-wpzz9" Dec 03 09:08:18 crc kubenswrapper[4831]: I1203 09:08:18.014594 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-must-gather-output\") pod \"must-gather-wpzz9\" (UID: \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\") " pod="openshift-must-gather-cqxbg/must-gather-wpzz9" Dec 03 09:08:18 crc kubenswrapper[4831]: I1203 09:08:18.116107 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-must-gather-output\") pod \"must-gather-wpzz9\" (UID: \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\") " pod="openshift-must-gather-cqxbg/must-gather-wpzz9" Dec 03 09:08:18 crc kubenswrapper[4831]: I1203 09:08:18.116410 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mxv5\" (UniqueName: \"kubernetes.io/projected/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-kube-api-access-7mxv5\") pod \"must-gather-wpzz9\" (UID: \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\") " pod="openshift-must-gather-cqxbg/must-gather-wpzz9" Dec 03 09:08:18 crc kubenswrapper[4831]: I1203 09:08:18.116654 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-must-gather-output\") pod \"must-gather-wpzz9\" (UID: \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\") " pod="openshift-must-gather-cqxbg/must-gather-wpzz9" Dec 03 09:08:18 crc kubenswrapper[4831]: I1203 09:08:18.140453 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mxv5\" (UniqueName: \"kubernetes.io/projected/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-kube-api-access-7mxv5\") pod \"must-gather-wpzz9\" (UID: \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\") " pod="openshift-must-gather-cqxbg/must-gather-wpzz9" Dec 03 09:08:18 crc kubenswrapper[4831]: I1203 09:08:18.231035 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cqxbg/must-gather-wpzz9" Dec 03 09:08:18 crc kubenswrapper[4831]: I1203 09:08:18.730995 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-cqxbg/must-gather-wpzz9"] Dec 03 09:08:19 crc kubenswrapper[4831]: I1203 09:08:19.636877 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:08:19 crc kubenswrapper[4831]: I1203 09:08:19.917729 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/must-gather-wpzz9" event={"ID":"8e6975b6-9f6b-4100-a72e-fcf15a303bc0","Type":"ContainerStarted","Data":"678b8d1f0e5f90d9782c152be657e5d937e602be123c899a8547497cd4eb669d"} Dec 03 09:08:25 crc kubenswrapper[4831]: I1203 09:08:25.983445 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/must-gather-wpzz9" event={"ID":"8e6975b6-9f6b-4100-a72e-fcf15a303bc0","Type":"ContainerStarted","Data":"01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e"} Dec 03 09:08:25 crc kubenswrapper[4831]: I1203 09:08:25.984124 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/must-gather-wpzz9" event={"ID":"8e6975b6-9f6b-4100-a72e-fcf15a303bc0","Type":"ContainerStarted","Data":"4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407"} Dec 03 09:08:26 crc kubenswrapper[4831]: I1203 09:08:26.006559 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-cqxbg/must-gather-wpzz9" podStartSLOduration=3.49909257 podStartE2EDuration="9.006532518s" podCreationTimestamp="2025-12-03 09:08:17 +0000 UTC" firstStartedPulling="2025-12-03 09:08:19.636639391 +0000 UTC m=+9436.980222899" lastFinishedPulling="2025-12-03 09:08:25.144079319 +0000 UTC m=+9442.487662847" observedRunningTime="2025-12-03 09:08:25.996541466 +0000 UTC m=+9443.340124974" watchObservedRunningTime="2025-12-03 09:08:26.006532518 +0000 UTC 
m=+9443.350116046" Dec 03 09:08:26 crc kubenswrapper[4831]: I1203 09:08:26.013102 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:08:26 crc kubenswrapper[4831]: E1203 09:08:26.013589 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:08:27 crc kubenswrapper[4831]: E1203 09:08:27.516360 4831 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.234:45024->38.102.83.234:39573: read tcp 38.102.83.234:45024->38.102.83.234:39573: read: connection reset by peer Dec 03 09:08:28 crc kubenswrapper[4831]: I1203 09:08:28.022548 4831 scope.go:117] "RemoveContainer" containerID="848266d7a0c7bd8941a2658681e684d7af8326e23dc0b3ad2048549cfc85a5ea" Dec 03 09:08:28 crc kubenswrapper[4831]: I1203 09:08:28.057502 4831 scope.go:117] "RemoveContainer" containerID="52957348b0fd1d96351f730d86d3dbe6ab804a44ad019a750069b498cc04942c" Dec 03 09:08:28 crc kubenswrapper[4831]: I1203 09:08:28.113608 4831 scope.go:117] "RemoveContainer" containerID="5b27e44bac93e2391f1e5134468625c12c54c0d9d9bccc0e2574663571963576" Dec 03 09:08:29 crc kubenswrapper[4831]: I1203 09:08:29.654631 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cqxbg/crc-debug-4mzw4"] Dec 03 09:08:29 crc kubenswrapper[4831]: I1203 09:08:29.657102 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:08:29 crc kubenswrapper[4831]: I1203 09:08:29.789366 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-host\") pod \"crc-debug-4mzw4\" (UID: \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\") " pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:08:29 crc kubenswrapper[4831]: I1203 09:08:29.789479 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpjqm\" (UniqueName: \"kubernetes.io/projected/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-kube-api-access-rpjqm\") pod \"crc-debug-4mzw4\" (UID: \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\") " pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:08:29 crc kubenswrapper[4831]: I1203 09:08:29.891572 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-host\") pod \"crc-debug-4mzw4\" (UID: \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\") " pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:08:29 crc kubenswrapper[4831]: I1203 09:08:29.891665 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpjqm\" (UniqueName: \"kubernetes.io/projected/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-kube-api-access-rpjqm\") pod \"crc-debug-4mzw4\" (UID: \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\") " pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:08:29 crc kubenswrapper[4831]: I1203 09:08:29.891700 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-host\") pod \"crc-debug-4mzw4\" (UID: \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\") " pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:08:29 crc 
kubenswrapper[4831]: I1203 09:08:29.913482 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpjqm\" (UniqueName: \"kubernetes.io/projected/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-kube-api-access-rpjqm\") pod \"crc-debug-4mzw4\" (UID: \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\") " pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:08:29 crc kubenswrapper[4831]: I1203 09:08:29.979966 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:08:31 crc kubenswrapper[4831]: I1203 09:08:31.044202 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" event={"ID":"22ae2838-9f61-41d7-bf3d-a7eddb6eea21","Type":"ContainerStarted","Data":"f3fdb1a3d7fe15dd51f8af2db1bf45c06191093832855d28186aca3b49c7bc15"} Dec 03 09:08:37 crc kubenswrapper[4831]: I1203 09:08:37.017062 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:08:37 crc kubenswrapper[4831]: E1203 09:08:37.017777 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:08:43 crc kubenswrapper[4831]: I1203 09:08:43.183137 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" event={"ID":"22ae2838-9f61-41d7-bf3d-a7eddb6eea21","Type":"ContainerStarted","Data":"097f4e54c14f87e9d683188e1eff86121233a196fcec34264c153e33753110c3"} Dec 03 09:08:43 crc kubenswrapper[4831]: I1203 09:08:43.207360 4831 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" podStartSLOduration=1.966589275 podStartE2EDuration="14.20728221s" podCreationTimestamp="2025-12-03 09:08:29 +0000 UTC" firstStartedPulling="2025-12-03 09:08:30.027674246 +0000 UTC m=+9447.371257754" lastFinishedPulling="2025-12-03 09:08:42.268367181 +0000 UTC m=+9459.611950689" observedRunningTime="2025-12-03 09:08:43.199018543 +0000 UTC m=+9460.542602061" watchObservedRunningTime="2025-12-03 09:08:43.20728221 +0000 UTC m=+9460.550865718" Dec 03 09:08:50 crc kubenswrapper[4831]: I1203 09:08:50.014105 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:08:50 crc kubenswrapper[4831]: E1203 09:08:50.014766 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:09:02 crc kubenswrapper[4831]: I1203 09:09:02.012659 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:09:02 crc kubenswrapper[4831]: E1203 09:09:02.013946 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:09:02 crc kubenswrapper[4831]: I1203 09:09:02.387821 4831 generic.go:334] "Generic (PLEG): container finished" 
podID="22ae2838-9f61-41d7-bf3d-a7eddb6eea21" containerID="097f4e54c14f87e9d683188e1eff86121233a196fcec34264c153e33753110c3" exitCode=0 Dec 03 09:09:02 crc kubenswrapper[4831]: I1203 09:09:02.387869 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" event={"ID":"22ae2838-9f61-41d7-bf3d-a7eddb6eea21","Type":"ContainerDied","Data":"097f4e54c14f87e9d683188e1eff86121233a196fcec34264c153e33753110c3"} Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.553700 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.596923 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cqxbg/crc-debug-4mzw4"] Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.610735 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cqxbg/crc-debug-4mzw4"] Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.702498 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-host\") pod \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\" (UID: \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\") " Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.702657 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-host" (OuterVolumeSpecName: "host") pod "22ae2838-9f61-41d7-bf3d-a7eddb6eea21" (UID: "22ae2838-9f61-41d7-bf3d-a7eddb6eea21"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.702738 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpjqm\" (UniqueName: \"kubernetes.io/projected/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-kube-api-access-rpjqm\") pod \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\" (UID: \"22ae2838-9f61-41d7-bf3d-a7eddb6eea21\") " Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.703119 4831 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-host\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.708213 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-kube-api-access-rpjqm" (OuterVolumeSpecName: "kube-api-access-rpjqm") pod "22ae2838-9f61-41d7-bf3d-a7eddb6eea21" (UID: "22ae2838-9f61-41d7-bf3d-a7eddb6eea21"). InnerVolumeSpecName "kube-api-access-rpjqm". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:09:03 crc kubenswrapper[4831]: I1203 09:09:03.805037 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rpjqm\" (UniqueName: \"kubernetes.io/projected/22ae2838-9f61-41d7-bf3d-a7eddb6eea21-kube-api-access-rpjqm\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:04 crc kubenswrapper[4831]: I1203 09:09:04.427015 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3fdb1a3d7fe15dd51f8af2db1bf45c06191093832855d28186aca3b49c7bc15" Dec 03 09:09:04 crc kubenswrapper[4831]: I1203 09:09:04.427092 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cqxbg/crc-debug-4mzw4" Dec 03 09:09:04 crc kubenswrapper[4831]: I1203 09:09:04.771859 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-cqxbg/crc-debug-lv5zv"] Dec 03 09:09:04 crc kubenswrapper[4831]: E1203 09:09:04.772392 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22ae2838-9f61-41d7-bf3d-a7eddb6eea21" containerName="container-00" Dec 03 09:09:04 crc kubenswrapper[4831]: I1203 09:09:04.772412 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="22ae2838-9f61-41d7-bf3d-a7eddb6eea21" containerName="container-00" Dec 03 09:09:04 crc kubenswrapper[4831]: I1203 09:09:04.772661 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="22ae2838-9f61-41d7-bf3d-a7eddb6eea21" containerName="container-00" Dec 03 09:09:04 crc kubenswrapper[4831]: I1203 09:09:04.773546 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:04 crc kubenswrapper[4831]: I1203 09:09:04.925344 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6szcm\" (UniqueName: \"kubernetes.io/projected/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-kube-api-access-6szcm\") pod \"crc-debug-lv5zv\" (UID: \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\") " pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:04 crc kubenswrapper[4831]: I1203 09:09:04.925398 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-host\") pod \"crc-debug-lv5zv\" (UID: \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\") " pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:05 crc kubenswrapper[4831]: I1203 09:09:05.028231 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6szcm\" (UniqueName: 
\"kubernetes.io/projected/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-kube-api-access-6szcm\") pod \"crc-debug-lv5zv\" (UID: \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\") " pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:05 crc kubenswrapper[4831]: I1203 09:09:05.028295 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-host\") pod \"crc-debug-lv5zv\" (UID: \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\") " pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:05 crc kubenswrapper[4831]: I1203 09:09:05.028524 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-host\") pod \"crc-debug-lv5zv\" (UID: \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\") " pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:05 crc kubenswrapper[4831]: I1203 09:09:05.033084 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22ae2838-9f61-41d7-bf3d-a7eddb6eea21" path="/var/lib/kubelet/pods/22ae2838-9f61-41d7-bf3d-a7eddb6eea21/volumes" Dec 03 09:09:05 crc kubenswrapper[4831]: I1203 09:09:05.049540 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6szcm\" (UniqueName: \"kubernetes.io/projected/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-kube-api-access-6szcm\") pod \"crc-debug-lv5zv\" (UID: \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\") " pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:05 crc kubenswrapper[4831]: I1203 09:09:05.097056 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:05 crc kubenswrapper[4831]: I1203 09:09:05.438366 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" event={"ID":"a4b179b2-3f26-47d2-9678-2ca4a7a17e49","Type":"ContainerStarted","Data":"02f68f083c80bf4c773970d6825366a20d3b66a78ca8ce9a5fb2e42c73cd961a"} Dec 03 09:09:06 crc kubenswrapper[4831]: I1203 09:09:06.463919 4831 generic.go:334] "Generic (PLEG): container finished" podID="a4b179b2-3f26-47d2-9678-2ca4a7a17e49" containerID="29572e064ace83bd878a24f4d3628ffec7719dda5aa7c36a7b7360ab8358e96c" exitCode=1 Dec 03 09:09:06 crc kubenswrapper[4831]: I1203 09:09:06.463988 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" event={"ID":"a4b179b2-3f26-47d2-9678-2ca4a7a17e49","Type":"ContainerDied","Data":"29572e064ace83bd878a24f4d3628ffec7719dda5aa7c36a7b7360ab8358e96c"} Dec 03 09:09:06 crc kubenswrapper[4831]: I1203 09:09:06.514462 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cqxbg/crc-debug-lv5zv"] Dec 03 09:09:06 crc kubenswrapper[4831]: I1203 09:09:06.526149 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cqxbg/crc-debug-lv5zv"] Dec 03 09:09:07 crc kubenswrapper[4831]: I1203 09:09:07.601510 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:07 crc kubenswrapper[4831]: I1203 09:09:07.698355 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6szcm\" (UniqueName: \"kubernetes.io/projected/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-kube-api-access-6szcm\") pod \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\" (UID: \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\") " Dec 03 09:09:07 crc kubenswrapper[4831]: I1203 09:09:07.698502 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-host\") pod \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\" (UID: \"a4b179b2-3f26-47d2-9678-2ca4a7a17e49\") " Dec 03 09:09:07 crc kubenswrapper[4831]: I1203 09:09:07.699204 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-host" (OuterVolumeSpecName: "host") pod "a4b179b2-3f26-47d2-9678-2ca4a7a17e49" (UID: "a4b179b2-3f26-47d2-9678-2ca4a7a17e49"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Dec 03 09:09:07 crc kubenswrapper[4831]: I1203 09:09:07.715379 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-kube-api-access-6szcm" (OuterVolumeSpecName: "kube-api-access-6szcm") pod "a4b179b2-3f26-47d2-9678-2ca4a7a17e49" (UID: "a4b179b2-3f26-47d2-9678-2ca4a7a17e49"). InnerVolumeSpecName "kube-api-access-6szcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:09:07 crc kubenswrapper[4831]: I1203 09:09:07.800745 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6szcm\" (UniqueName: \"kubernetes.io/projected/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-kube-api-access-6szcm\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:07 crc kubenswrapper[4831]: I1203 09:09:07.801010 4831 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a4b179b2-3f26-47d2-9678-2ca4a7a17e49-host\") on node \"crc\" DevicePath \"\"" Dec 03 09:09:08 crc kubenswrapper[4831]: I1203 09:09:08.491519 4831 scope.go:117] "RemoveContainer" containerID="29572e064ace83bd878a24f4d3628ffec7719dda5aa7c36a7b7360ab8358e96c" Dec 03 09:09:08 crc kubenswrapper[4831]: I1203 09:09:08.491617 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxbg/crc-debug-lv5zv" Dec 03 09:09:09 crc kubenswrapper[4831]: I1203 09:09:09.026108 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4b179b2-3f26-47d2-9678-2ca4a7a17e49" path="/var/lib/kubelet/pods/a4b179b2-3f26-47d2-9678-2ca4a7a17e49/volumes" Dec 03 09:09:16 crc kubenswrapper[4831]: I1203 09:09:16.013446 4831 scope.go:117] "RemoveContainer" containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:09:16 crc kubenswrapper[4831]: E1203 09:09:16.014182 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:09:30 crc kubenswrapper[4831]: I1203 09:09:30.013245 4831 scope.go:117] "RemoveContainer" 
containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:09:30 crc kubenswrapper[4831]: I1203 09:09:30.734934 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"fa4eb3500b4af6b5ec70d42d28664dd92eb6006e7417e0949f20cc87d4e6794c"} Dec 03 09:11:46 crc kubenswrapper[4831]: I1203 09:11:46.933242 4831 trace.go:236] Trace[1380917416]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (03-Dec-2025 09:11:45.609) (total time: 1323ms): Dec 03 09:11:46 crc kubenswrapper[4831]: Trace[1380917416]: [1.323770518s] [1.323770518s] END Dec 03 09:11:57 crc kubenswrapper[4831]: I1203 09:11:57.597173 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:11:57 crc kubenswrapper[4831]: I1203 09:11:57.597974 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:12:27 crc kubenswrapper[4831]: I1203 09:12:27.597161 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:12:27 crc kubenswrapper[4831]: I1203 09:12:27.597719 4831 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.278824 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-zsw26"] Dec 03 09:12:55 crc kubenswrapper[4831]: E1203 09:12:55.283038 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4b179b2-3f26-47d2-9678-2ca4a7a17e49" containerName="container-00" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.283055 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4b179b2-3f26-47d2-9678-2ca4a7a17e49" containerName="container-00" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.283269 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4b179b2-3f26-47d2-9678-2ca4a7a17e49" containerName="container-00" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.285073 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.298580 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zsw26"] Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.385893 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qnwl\" (UniqueName: \"kubernetes.io/projected/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-kube-api-access-7qnwl\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.385949 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-catalog-content\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.386230 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-utilities\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.487667 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qnwl\" (UniqueName: \"kubernetes.io/projected/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-kube-api-access-7qnwl\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.487708 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-catalog-content\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.487782 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-utilities\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.488352 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-catalog-content\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.488459 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-utilities\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.621096 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qnwl\" (UniqueName: \"kubernetes.io/projected/eff1e6c4-5941-48cc-a498-1dbd6c977a9a-kube-api-access-7qnwl\") pod \"redhat-operators-zsw26\" (UID: \"eff1e6c4-5941-48cc-a498-1dbd6c977a9a\") " pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:55 crc kubenswrapper[4831]: I1203 09:12:55.630928 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:12:56 crc kubenswrapper[4831]: I1203 09:12:56.169152 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zsw26"] Dec 03 09:12:56 crc kubenswrapper[4831]: I1203 09:12:56.488109 4831 generic.go:334] "Generic (PLEG): container finished" podID="eff1e6c4-5941-48cc-a498-1dbd6c977a9a" containerID="7c58f49efba622d99efeea3ee6b6dbe8ab870b4c234ef590dbf6fdde2ce3e01b" exitCode=0 Dec 03 09:12:56 crc kubenswrapper[4831]: I1203 09:12:56.488152 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsw26" event={"ID":"eff1e6c4-5941-48cc-a498-1dbd6c977a9a","Type":"ContainerDied","Data":"7c58f49efba622d99efeea3ee6b6dbe8ab870b4c234ef590dbf6fdde2ce3e01b"} Dec 03 09:12:56 crc kubenswrapper[4831]: I1203 09:12:56.488179 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsw26" event={"ID":"eff1e6c4-5941-48cc-a498-1dbd6c977a9a","Type":"ContainerStarted","Data":"3a26c1b72686c67563f08efc1e3f4e7e111cddaec70d6bd175373c6d34c76194"} Dec 03 09:12:57 crc kubenswrapper[4831]: I1203 09:12:57.596986 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:12:57 crc kubenswrapper[4831]: I1203 09:12:57.597360 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:12:57 crc kubenswrapper[4831]: I1203 09:12:57.597422 4831 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 09:12:57 crc kubenswrapper[4831]: I1203 09:12:57.598401 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fa4eb3500b4af6b5ec70d42d28664dd92eb6006e7417e0949f20cc87d4e6794c"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:12:57 crc kubenswrapper[4831]: I1203 09:12:57.598481 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://fa4eb3500b4af6b5ec70d42d28664dd92eb6006e7417e0949f20cc87d4e6794c" gracePeriod=600 Dec 03 09:12:58 crc kubenswrapper[4831]: I1203 09:12:58.514770 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="fa4eb3500b4af6b5ec70d42d28664dd92eb6006e7417e0949f20cc87d4e6794c" exitCode=0 Dec 03 09:12:58 crc kubenswrapper[4831]: I1203 09:12:58.514902 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"fa4eb3500b4af6b5ec70d42d28664dd92eb6006e7417e0949f20cc87d4e6794c"} Dec 03 09:12:58 crc kubenswrapper[4831]: I1203 09:12:58.515111 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"} Dec 03 09:12:58 crc kubenswrapper[4831]: I1203 09:12:58.515127 4831 scope.go:117] "RemoveContainer" 
containerID="ae1ca4b6a640dc2106b3b1cb5cbc27e00d63032cfd239b2c0572cb21720a4fa2" Dec 03 09:13:05 crc kubenswrapper[4831]: I1203 09:13:05.644966 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsw26" event={"ID":"eff1e6c4-5941-48cc-a498-1dbd6c977a9a","Type":"ContainerStarted","Data":"e1b712a19c9f79464cc769c723f85eb642160ecfe79248ac1c98d0b83915acc1"} Dec 03 09:13:08 crc kubenswrapper[4831]: I1203 09:13:08.675918 4831 generic.go:334] "Generic (PLEG): container finished" podID="eff1e6c4-5941-48cc-a498-1dbd6c977a9a" containerID="e1b712a19c9f79464cc769c723f85eb642160ecfe79248ac1c98d0b83915acc1" exitCode=0 Dec 03 09:13:08 crc kubenswrapper[4831]: I1203 09:13:08.676018 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsw26" event={"ID":"eff1e6c4-5941-48cc-a498-1dbd6c977a9a","Type":"ContainerDied","Data":"e1b712a19c9f79464cc769c723f85eb642160ecfe79248ac1c98d0b83915acc1"} Dec 03 09:13:09 crc kubenswrapper[4831]: I1203 09:13:09.690438 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-zsw26" event={"ID":"eff1e6c4-5941-48cc-a498-1dbd6c977a9a","Type":"ContainerStarted","Data":"f218503e5b04b98355ee1258c1be2b393beb5fa81dc71c8cd24fce4d4d4fda25"} Dec 03 09:13:09 crc kubenswrapper[4831]: I1203 09:13:09.723540 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-zsw26" podStartSLOduration=2.149398053 podStartE2EDuration="14.723516688s" podCreationTimestamp="2025-12-03 09:12:55 +0000 UTC" firstStartedPulling="2025-12-03 09:12:56.490016876 +0000 UTC m=+9713.833600384" lastFinishedPulling="2025-12-03 09:13:09.064135511 +0000 UTC m=+9726.407719019" observedRunningTime="2025-12-03 09:13:09.711160783 +0000 UTC m=+9727.054744291" watchObservedRunningTime="2025-12-03 09:13:09.723516688 +0000 UTC m=+9727.067100206" Dec 03 09:13:15 crc kubenswrapper[4831]: I1203 09:13:15.631047 4831 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:13:15 crc kubenswrapper[4831]: I1203 09:13:15.631610 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:13:15 crc kubenswrapper[4831]: I1203 09:13:15.685557 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:13:15 crc kubenswrapper[4831]: I1203 09:13:15.804563 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-zsw26" Dec 03 09:13:15 crc kubenswrapper[4831]: I1203 09:13:15.871577 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-zsw26"] Dec 03 09:13:15 crc kubenswrapper[4831]: I1203 09:13:15.945302 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vshtj"] Dec 03 09:13:15 crc kubenswrapper[4831]: I1203 09:13:15.945866 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vshtj" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerName="registry-server" containerID="cri-o://bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e" gracePeriod=2 Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.473278 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vshtj" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.598961 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2j59\" (UniqueName: \"kubernetes.io/projected/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-kube-api-access-d2j59\") pod \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.599034 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-catalog-content\") pod \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.599254 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-utilities\") pod \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\" (UID: \"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71\") " Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.599728 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-utilities" (OuterVolumeSpecName: "utilities") pod "89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" (UID: "89c1dc30-6b2e-4dbd-916d-0c6fd5907a71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.606770 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-kube-api-access-d2j59" (OuterVolumeSpecName: "kube-api-access-d2j59") pod "89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" (UID: "89c1dc30-6b2e-4dbd-916d-0c6fd5907a71"). InnerVolumeSpecName "kube-api-access-d2j59". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.701639 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.701682 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2j59\" (UniqueName: \"kubernetes.io/projected/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-kube-api-access-d2j59\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.712562 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" (UID: "89c1dc30-6b2e-4dbd-916d-0c6fd5907a71"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.785979 4831 generic.go:334] "Generic (PLEG): container finished" podID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerID="bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e" exitCode=0 Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.786056 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vshtj" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.786081 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vshtj" event={"ID":"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71","Type":"ContainerDied","Data":"bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e"} Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.786129 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vshtj" event={"ID":"89c1dc30-6b2e-4dbd-916d-0c6fd5907a71","Type":"ContainerDied","Data":"33d389d2583fa36b98c2a5804a3956852ea846a7f412b3e3f3f7826bacced236"} Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.786150 4831 scope.go:117] "RemoveContainer" containerID="bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.803737 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.817439 4831 scope.go:117] "RemoveContainer" containerID="f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.822358 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vshtj"] Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.839851 4831 scope.go:117] "RemoveContainer" containerID="197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.853189 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vshtj"] Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.891179 4831 scope.go:117] "RemoveContainer" 
containerID="bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e" Dec 03 09:13:16 crc kubenswrapper[4831]: E1203 09:13:16.891737 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e\": container with ID starting with bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e not found: ID does not exist" containerID="bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.891834 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e"} err="failed to get container status \"bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e\": rpc error: code = NotFound desc = could not find container \"bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e\": container with ID starting with bfe1d53fe45e9da86d6522f96847e818410966057a73152043ae38589fbb626e not found: ID does not exist" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.891958 4831 scope.go:117] "RemoveContainer" containerID="f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976" Dec 03 09:13:16 crc kubenswrapper[4831]: E1203 09:13:16.892292 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976\": container with ID starting with f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976 not found: ID does not exist" containerID="f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.892382 4831 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976"} err="failed to get container status \"f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976\": rpc error: code = NotFound desc = could not find container \"f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976\": container with ID starting with f106ee1065d0edc8ea1907f13ee1f7274f73fd38a981dc7620cfcab19ae7e976 not found: ID does not exist" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.892438 4831 scope.go:117] "RemoveContainer" containerID="197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b" Dec 03 09:13:16 crc kubenswrapper[4831]: E1203 09:13:16.892761 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b\": container with ID starting with 197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b not found: ID does not exist" containerID="197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b" Dec 03 09:13:16 crc kubenswrapper[4831]: I1203 09:13:16.892814 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b"} err="failed to get container status \"197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b\": rpc error: code = NotFound desc = could not find container \"197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b\": container with ID starting with 197066afae1593da3e7c50c1d22446492151340e589f572f207388e4f644f78b not found: ID does not exist" Dec 03 09:13:17 crc kubenswrapper[4831]: I1203 09:13:17.025793 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" path="/var/lib/kubelet/pods/89c1dc30-6b2e-4dbd-916d-0c6fd5907a71/volumes" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 
09:14:45.163659 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_aefc8d48-0a47-424c-bce7-8e5c75a6d0fe/init-config-reloader/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.368046 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_aefc8d48-0a47-424c-bce7-8e5c75a6d0fe/init-config-reloader/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.379755 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_aefc8d48-0a47-424c-bce7-8e5c75a6d0fe/alertmanager/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.429172 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_alertmanager-metric-storage-0_aefc8d48-0a47-424c-bce7-8e5c75a6d0fe/config-reloader/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.610816 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_53ba8fe0-f472-4cd8-b061-4e5ac2be04e2/aodh-api/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.681730 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_53ba8fe0-f472-4cd8-b061-4e5ac2be04e2/aodh-evaluator/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.785772 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_53ba8fe0-f472-4cd8-b061-4e5ac2be04e2/aodh-listener/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.827694 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_aodh-0_53ba8fe0-f472-4cd8-b061-4e5ac2be04e2/aodh-notifier/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.934464 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-5d665c4464-jxss9_882ddda9-c85e-4b93-afaf-b34b080d7047/barbican-api/0.log" Dec 03 09:14:45 crc kubenswrapper[4831]: I1203 09:14:45.994304 4831 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_barbican-api-5d665c4464-jxss9_882ddda9-c85e-4b93-afaf-b34b080d7047/barbican-api-log/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.161967 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f65b6d578-mkb48_3f30a154-45b6-41f5-8aad-4019e18f01b6/barbican-keystone-listener/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.262246 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-f65b6d578-mkb48_3f30a154-45b6-41f5-8aad-4019e18f01b6/barbican-keystone-listener-log/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.340460 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-758d7559-cbnxt_e1877ded-6e84-4ca9-b911-3e2996993bdb/barbican-worker/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.407754 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-758d7559-cbnxt_e1877ded-6e84-4ca9-b911-3e2996993bdb/barbican-worker-log/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.629958 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-openstack-openstack-cell1-ksczq_8e3a439c-c9c5-432d-8fbd-c9854822d349/bootstrap-openstack-openstack-cell1/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.661069 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6db43ea-cd06-4251-92b9-3b66231110ba/ceilometer-central-agent/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.801405 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6db43ea-cd06-4251-92b9-3b66231110ba/ceilometer-notification-agent/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.838571 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_f6db43ea-cd06-4251-92b9-3b66231110ba/proxy-httpd/0.log" Dec 03 09:14:46 crc kubenswrapper[4831]: I1203 09:14:46.896581 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_f6db43ea-cd06-4251-92b9-3b66231110ba/sg-core/0.log" Dec 03 09:14:47 crc kubenswrapper[4831]: I1203 09:14:47.084808 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-openstack-openstack-cell1-6mmwx_aa8d1192-9ba3-44c8-b5f0-78992aabc7d2/ceph-client-openstack-openstack-cell1/0.log" Dec 03 09:14:47 crc kubenswrapper[4831]: I1203 09:14:47.202734 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_80590171-e127-452a-8c6f-666b84bd0a6e/cinder-api/0.log" Dec 03 09:14:47 crc kubenswrapper[4831]: I1203 09:14:47.238681 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_80590171-e127-452a-8c6f-666b84bd0a6e/cinder-api-log/0.log" Dec 03 09:14:47 crc kubenswrapper[4831]: I1203 09:14:47.496210 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0dd4df11-40e0-455a-81a1-e0b82a541868/cinder-backup/0.log" Dec 03 09:14:47 crc kubenswrapper[4831]: I1203 09:14:47.941993 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_0dd4df11-40e0-455a-81a1-e0b82a541868/probe/0.log" Dec 03 09:14:47 crc kubenswrapper[4831]: I1203 09:14:47.970184 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_33a61679-48f8-4157-84db-db0208fc85ad/cinder-scheduler/0.log" Dec 03 09:14:48 crc kubenswrapper[4831]: I1203 09:14:48.046173 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_33a61679-48f8-4157-84db-db0208fc85ad/probe/0.log" Dec 03 09:14:48 crc kubenswrapper[4831]: I1203 09:14:48.288357 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_cinder-volume-volume1-0_8a7f424f-8685-47e7-a374-44e1e824e364/cinder-volume/0.log" Dec 03 09:14:48 crc kubenswrapper[4831]: I1203 09:14:48.336593 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_8a7f424f-8685-47e7-a374-44e1e824e364/probe/0.log" Dec 03 09:14:48 crc kubenswrapper[4831]: I1203 09:14:48.439799 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-openstack-openstack-cell1-gtncm_6843897c-b341-4bcc-9a38-c9c9707022e8/configure-network-openstack-openstack-cell1/0.log" Dec 03 09:14:48 crc kubenswrapper[4831]: I1203 09:14:48.752527 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-openstack-openstack-cell1-wgszl_61e7c5dd-7643-4f95-a7d0-acff11d86694/configure-os-openstack-openstack-cell1/0.log" Dec 03 09:14:48 crc kubenswrapper[4831]: I1203 09:14:48.967545 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5db96bfff9-kgj7z_4bc527be-3f47-4fac-9edd-4252cc8e6ee1/init/0.log" Dec 03 09:14:49 crc kubenswrapper[4831]: I1203 09:14:49.181898 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5db96bfff9-kgj7z_4bc527be-3f47-4fac-9edd-4252cc8e6ee1/init/0.log" Dec 03 09:14:49 crc kubenswrapper[4831]: I1203 09:14:49.232068 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-openstack-openstack-cell1-nzmhv_13bbbc69-d429-4189-a64f-070d16440ed4/download-cache-openstack-openstack-cell1/0.log" Dec 03 09:14:49 crc kubenswrapper[4831]: I1203 09:14:49.240772 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-5db96bfff9-kgj7z_4bc527be-3f47-4fac-9edd-4252cc8e6ee1/dnsmasq-dns/0.log" Dec 03 09:14:49 crc kubenswrapper[4831]: I1203 09:14:49.840390 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_cdf75c26-8fcd-4895-954b-c02971d19231/glance-log/0.log" Dec 03 09:14:49 crc kubenswrapper[4831]: I1203 09:14:49.849774 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_34509753-3280-4c6f-91c5-c96cd04044a3/glance-log/0.log" Dec 03 09:14:49 crc kubenswrapper[4831]: I1203 09:14:49.867010 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_cdf75c26-8fcd-4895-954b-c02971d19231/glance-httpd/0.log" Dec 03 09:14:49 crc kubenswrapper[4831]: I1203 09:14:49.909031 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_34509753-3280-4c6f-91c5-c96cd04044a3/glance-httpd/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.126975 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-api-86c4895765-xdwvz_749aa4ea-dd38-4c6c-a33f-a7467e7d76ab/heat-api/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.225560 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-cfnapi-56bc69b54c-5mw4x_eb164731-d0b2-4f1a-992b-48d19d451819/heat-cfnapi/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.259061 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_heat-engine-675cdcc9cb-xvrsg_b7767894-04f2-4d29-b76b-7157e045803d/heat-engine/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.393420 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78874fb77c-dtxst_e0210cd0-445e-4bb0-94a4-b7387a4ff3fe/horizon/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.523198 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-78874fb77c-dtxst_e0210cd0-445e-4bb0-94a4-b7387a4ff3fe/horizon-log/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.558087 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_install-certs-openstack-openstack-cell1-5jpj2_8a974bcc-3f87-42a4-9a9c-75fa79fcd20f/install-certs-openstack-openstack-cell1/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.620425 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-openstack-openstack-cell1-cjwd2_4287499e-d78b-43b2-b353-18c288c585a4/install-os-openstack-openstack-cell1/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.830337 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-549cddd46f-vzg66_ebdb8511-1378-49f0-bda9-e7ae48599ca6/keystone-api/0.log" Dec 03 09:14:50 crc kubenswrapper[4831]: I1203 09:14:50.835725 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412481-qnsng_34df800e-8b70-4658-a6ae-a639bca251f5/keystone-cron/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.002921 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29412541-9qw4r_b8d4be97-9167-4feb-b779-ba8db1269611/keystone-cron/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.124727 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_670949d7-821a-4f20-84df-e32be87cac88/kube-state-metrics/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.185616 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-openstack-openstack-cell1-kwxr4_eb53812e-d0f1-4a38-b47b-00917ae4fa4f/libvirt-openstack-openstack-cell1/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.356491 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_4e7d2410-1d15-4296-a3cb-adc2baff3fc4/manila-api-log/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.432286 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_4e7d2410-1d15-4296-a3cb-adc2baff3fc4/manila-api/0.log" Dec 03 09:14:51 crc 
kubenswrapper[4831]: I1203 09:14:51.507908 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_cc246fd1-9874-441f-85c4-67712abd90d3/manila-scheduler/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.552003 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_cc246fd1-9874-441f-85c4-67712abd90d3/probe/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.674742 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_93961821-7d28-43df-8440-250747588c2d/manila-share/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.683592 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_93961821-7d28-43df-8440-250747588c2d/probe/0.log" Dec 03 09:14:51 crc kubenswrapper[4831]: I1203 09:14:51.992897 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67c669c55-sljbg_f3f7b2f0-4404-4060-a04c-02da1d8f7c43/neutron-httpd/0.log" Dec 03 09:14:52 crc kubenswrapper[4831]: I1203 09:14:52.127480 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67c669c55-sljbg_f3f7b2f0-4404-4060-a04c-02da1d8f7c43/neutron-api/0.log" Dec 03 09:14:52 crc kubenswrapper[4831]: I1203 09:14:52.507669 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-dhcp-openstack-openstack-cell1-c65dr_44f4ecb4-df16-4534-804d-2df7d53861cb/neutron-dhcp-openstack-openstack-cell1/0.log" Dec 03 09:14:52 crc kubenswrapper[4831]: I1203 09:14:52.614441 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-openstack-openstack-cell1-vf2x4_1a59fd86-ca99-4128-bbcc-6f1075dbafce/neutron-metadata-openstack-openstack-cell1/0.log" Dec 03 09:14:52 crc kubenswrapper[4831]: I1203 09:14:52.955868 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-sriov-openstack-openstack-cell1-7hxpj_b0f24e55-9970-4294-8bca-b289f2958f85/neutron-sriov-openstack-openstack-cell1/0.log" Dec 03 09:14:53 crc kubenswrapper[4831]: I1203 09:14:53.037924 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_913de50c-5388-47ab-9bc1-32292ef5c42f/nova-api-api/0.log" Dec 03 09:14:53 crc kubenswrapper[4831]: I1203 09:14:53.092668 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_913de50c-5388-47ab-9bc1-32292ef5c42f/nova-api-log/0.log" Dec 03 09:14:53 crc kubenswrapper[4831]: I1203 09:14:53.290201 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_a2be4cea-396e-49b0-aef1-4c28ac8dcd78/nova-cell0-conductor-conductor/0.log" Dec 03 09:14:53 crc kubenswrapper[4831]: I1203 09:14:53.457030 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_2a89396d-a305-4cea-b054-d3ad772f79e0/nova-cell1-conductor-conductor/0.log" Dec 03 09:14:53 crc kubenswrapper[4831]: I1203 09:14:53.683230 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_45c49f52-ed33-4f4b-a70e-6fb63117f774/nova-cell1-novncproxy-novncproxy/0.log" Dec 03 09:14:53 crc kubenswrapper[4831]: I1203 09:14:53.778889 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cellpphqk_bfa3cd02-c3f9-4817-8c79-f57f2d2dfa4d/nova-cell1-openstack-nova-compute-ffu-cell1-openstack-cell1/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.085385 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9fafc47f-ac43-45da-b0da-9941dbdc87f1/nova-metadata-log/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.101263 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell1-openstack-openstack-cell1-fjbvm_71e965c2-d7fd-4d55-8383-e50b6f3ac1ac/nova-cell1-openstack-openstack-cell1/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.121303 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_9fafc47f-ac43-45da-b0da-9941dbdc87f1/nova-metadata-metadata/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.379890 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6655c79d96-8cnxb_ed6653bb-4e85-401b-a22c-f834ceea376b/init/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.424704 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_b2e131ef-51b4-4d7b-95f1-be753d22436a/nova-scheduler-scheduler/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.624429 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6655c79d96-8cnxb_ed6653bb-4e85-401b-a22c-f834ceea376b/init/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.634159 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6655c79d96-8cnxb_ed6653bb-4e85-401b-a22c-f834ceea376b/octavia-api-provider-agent/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.860062 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-hw4hf_4394f7db-9b3d-425c-a57b-2c7bdcbbe251/init/0.log" Dec 03 09:14:54 crc kubenswrapper[4831]: I1203 09:14:54.937539 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-api-6655c79d96-8cnxb_ed6653bb-4e85-401b-a22c-f834ceea376b/octavia-api/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.034602 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-hw4hf_4394f7db-9b3d-425c-a57b-2c7bdcbbe251/init/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.146735 4831 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_octavia-healthmanager-hw4hf_4394f7db-9b3d-425c-a57b-2c7bdcbbe251/octavia-healthmanager/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.197491 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-2xwt9_d8c313f5-00e9-49ee-ab5e-3eefaaf09202/init/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.460722 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-2xwt9_d8c313f5-00e9-49ee-ab5e-3eefaaf09202/init/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.518939 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-7vxgf_d81b16a8-4c45-4d34-8c90-1e6cd00ead93/init/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.521510 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-housekeeping-2xwt9_d8c313f5-00e9-49ee-ab5e-3eefaaf09202/octavia-housekeeping/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.761094 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-7vxgf_d81b16a8-4c45-4d34-8c90-1e6cd00ead93/init/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.863786 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-rsyslog-7vxgf_d81b16a8-4c45-4d34-8c90-1e6cd00ead93/octavia-rsyslog/0.log" Dec 03 09:14:55 crc kubenswrapper[4831]: I1203 09:14:55.879638 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-qj6pc_c2473ca8-2ca7-4c12-afca-955d003ffa8b/init/0.log" Dec 03 09:14:56 crc kubenswrapper[4831]: I1203 09:14:56.335283 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-qj6pc_c2473ca8-2ca7-4c12-afca-955d003ffa8b/init/0.log" Dec 03 09:14:56 crc kubenswrapper[4831]: I1203 09:14:56.478497 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-cell1-galera-0_5db05cc0-7933-455e-8b79-23ee19abd027/mysql-bootstrap/0.log" Dec 03 09:14:56 crc kubenswrapper[4831]: I1203 09:14:56.498005 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_octavia-worker-qj6pc_c2473ca8-2ca7-4c12-afca-955d003ffa8b/octavia-worker/0.log" Dec 03 09:14:56 crc kubenswrapper[4831]: I1203 09:14:56.631578 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5db05cc0-7933-455e-8b79-23ee19abd027/galera/0.log" Dec 03 09:14:56 crc kubenswrapper[4831]: I1203 09:14:56.633116 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_5db05cc0-7933-455e-8b79-23ee19abd027/mysql-bootstrap/0.log" Dec 03 09:14:56 crc kubenswrapper[4831]: I1203 09:14:56.693399 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa/mysql-bootstrap/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.169161 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa/galera/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.178649 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_85e48574-0a26-4ec9-ac44-f59acf845387/openstackclient/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.210723 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_3d5e462b-9b73-4bd3-a8a9-e7c95e7a1ffa/mysql-bootstrap/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.427996 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-dcd7t_0627e5fa-a9cc-4fc2-ae3d-7348ecafd3f8/ovn-controller/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.504141 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-vmqws_a5d61e42-ab2f-40b3-8b9f-bb480e485790/openstack-network-exporter/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.662338 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9lvg5_6a8df1d7-27d5-417c-b10e-379dad30e5cf/ovsdb-server-init/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.879840 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9lvg5_6a8df1d7-27d5-417c-b10e-379dad30e5cf/ovsdb-server-init/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.923028 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9lvg5_6a8df1d7-27d5-417c-b10e-379dad30e5cf/ovs-vswitchd/0.log" Dec 03 09:14:57 crc kubenswrapper[4831]: I1203 09:14:57.969117 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-9lvg5_6a8df1d7-27d5-417c-b10e-379dad30e5cf/ovsdb-server/0.log" Dec 03 09:14:58 crc kubenswrapper[4831]: I1203 09:14:58.127555 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_67342401-b261-4a0c-9f2a-f275307dc042/openstack-network-exporter/0.log" Dec 03 09:14:58 crc kubenswrapper[4831]: I1203 09:14:58.147655 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_67342401-b261-4a0c-9f2a-f275307dc042/ovn-northd/0.log" Dec 03 09:14:58 crc kubenswrapper[4831]: I1203 09:14:58.455260 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7e33c41f-a322-46e0-83ed-309466254c79/openstack-network-exporter/0.log" Dec 03 09:14:58 crc kubenswrapper[4831]: I1203 09:14:58.501635 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-openstack-openstack-cell1-xk5l7_98063d34-441b-4c6b-aa0c-4c600be73767/ovn-openstack-openstack-cell1/0.log" Dec 03 09:14:58 crc kubenswrapper[4831]: I1203 09:14:58.573150 4831 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_7e33c41f-a322-46e0-83ed-309466254c79/ovsdbserver-nb/0.log" Dec 03 09:14:59 crc kubenswrapper[4831]: I1203 09:14:59.382045 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_a1bcc142-49f9-466a-af9b-033b4375a87e/openstack-network-exporter/0.log" Dec 03 09:14:59 crc kubenswrapper[4831]: I1203 09:14:59.518531 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-1_a1bcc142-49f9-466a-af9b-033b4375a87e/ovsdbserver-nb/0.log" Dec 03 09:14:59 crc kubenswrapper[4831]: I1203 09:14:59.625835 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0589080a-1977-4ad4-9660-3db4472b78b4/openstack-network-exporter/0.log" Dec 03 09:14:59 crc kubenswrapper[4831]: I1203 09:14:59.697510 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-2_0589080a-1977-4ad4-9660-3db4472b78b4/ovsdbserver-nb/0.log" Dec 03 09:14:59 crc kubenswrapper[4831]: I1203 09:14:59.802619 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_80e2b474-5b25-4074-920b-844874ab8fab/openstack-network-exporter/0.log" Dec 03 09:14:59 crc kubenswrapper[4831]: I1203 09:14:59.894147 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_80e2b474-5b25-4074-920b-844874ab8fab/ovsdbserver-sb/0.log" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.023620 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_2131c614-9f45-4b2b-99cf-1b830da4013a/ovsdbserver-sb/0.log" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.071255 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-1_2131c614-9f45-4b2b-99cf-1b830da4013a/openstack-network-exporter/0.log" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.152643 4831 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw"] Dec 03 09:15:00 crc kubenswrapper[4831]: E1203 09:15:00.158592 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerName="extract-utilities" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.158612 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerName="extract-utilities" Dec 03 09:15:00 crc kubenswrapper[4831]: E1203 09:15:00.158631 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerName="extract-content" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.158641 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerName="extract-content" Dec 03 09:15:00 crc kubenswrapper[4831]: E1203 09:15:00.158659 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerName="registry-server" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.158667 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerName="registry-server" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.158934 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="89c1dc30-6b2e-4dbd-916d-0c6fd5907a71" containerName="registry-server" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.160544 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.163095 4831 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.163642 4831 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.175197 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw"] Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.298885 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07677260-9c2a-4dba-af59-a52e06364344-secret-volume\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.298947 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsrrl\" (UniqueName: \"kubernetes.io/projected/07677260-9c2a-4dba-af59-a52e06364344-kube-api-access-tsrrl\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.299061 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07677260-9c2a-4dba-af59-a52e06364344-config-volume\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.401588 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07677260-9c2a-4dba-af59-a52e06364344-config-volume\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.401797 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07677260-9c2a-4dba-af59-a52e06364344-secret-volume\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.401833 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsrrl\" (UniqueName: \"kubernetes.io/projected/07677260-9c2a-4dba-af59-a52e06364344-kube-api-access-tsrrl\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.402945 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07677260-9c2a-4dba-af59-a52e06364344-config-volume\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.415179 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/07677260-9c2a-4dba-af59-a52e06364344-secret-volume\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.437300 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsrrl\" (UniqueName: \"kubernetes.io/projected/07677260-9c2a-4dba-af59-a52e06364344-kube-api-access-tsrrl\") pod \"collect-profiles-29412555-cfjgw\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.488226 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.616046 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_acbbc7a4-e670-4315-b4d2-28702c8af2aa/ovsdbserver-sb/0.log" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.679053 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-2_acbbc7a4-e670-4315-b4d2-28702c8af2aa/openstack-network-exporter/0.log" Dec 03 09:15:00 crc kubenswrapper[4831]: I1203 09:15:00.931885 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7647b977b-m9c66_3cd70bab-bce8-47d5-a5cd-115fb729ec02/placement-api/0.log" Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.031904 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-7647b977b-m9c66_3cd70bab-bce8-47d5-a5cd-115fb729ec02/placement-log/0.log" Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.112212 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw"] Dec 03 09:15:01 crc kubenswrapper[4831]: 
I1203 09:15:01.121977 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_pre-adoption-validation-openstack-pre-adoption-openstack-c746qw_b73afe21-6869-4926-b295-8ec52f0e41be/pre-adoption-validation-openstack-pre-adoption-openstack-cell1/0.log" Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.322130 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b7346d4a-a166-4419-8305-76ecd3ccf9b1/init-config-reloader/0.log" Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.631810 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b7346d4a-a166-4419-8305-76ecd3ccf9b1/prometheus/0.log" Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.649913 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b7346d4a-a166-4419-8305-76ecd3ccf9b1/init-config-reloader/0.log" Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.712588 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b7346d4a-a166-4419-8305-76ecd3ccf9b1/config-reloader/0.log" Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.713020 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_prometheus-metric-storage-0_b7346d4a-a166-4419-8305-76ecd3ccf9b1/thanos-sidecar/0.log" Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.908776 4831 generic.go:334] "Generic (PLEG): container finished" podID="07677260-9c2a-4dba-af59-a52e06364344" containerID="deb9efcdf1a628d81c79fb622cda081d48c6aa5c1390499843b6d2715728955c" exitCode=0 Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.909116 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" event={"ID":"07677260-9c2a-4dba-af59-a52e06364344","Type":"ContainerDied","Data":"deb9efcdf1a628d81c79fb622cda081d48c6aa5c1390499843b6d2715728955c"} Dec 
03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.909144 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" event={"ID":"07677260-9c2a-4dba-af59-a52e06364344","Type":"ContainerStarted","Data":"0ed342ca7e8270c9a48a81ae43f43d5237c58df5314b418c2dd5e62f7c7c7f04"} Dec 03 09:15:01 crc kubenswrapper[4831]: I1203 09:15:01.936780 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_933ee7aa-0ba3-46f2-a093-ffe91b58f62e/setup-container/0.log" Dec 03 09:15:02 crc kubenswrapper[4831]: I1203 09:15:02.354959 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_af0f187c-abc8-40f5-97a3-8e75e2e12769/setup-container/0.log" Dec 03 09:15:02 crc kubenswrapper[4831]: I1203 09:15:02.433962 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_933ee7aa-0ba3-46f2-a093-ffe91b58f62e/setup-container/0.log" Dec 03 09:15:02 crc kubenswrapper[4831]: I1203 09:15:02.452124 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_933ee7aa-0ba3-46f2-a093-ffe91b58f62e/rabbitmq/0.log" Dec 03 09:15:02 crc kubenswrapper[4831]: I1203 09:15:02.482075 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_fda9b8fb-e9a0-4c3a-8b54-13a6d5f35af4/memcached/0.log" Dec 03 09:15:02 crc kubenswrapper[4831]: I1203 09:15:02.634792 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_af0f187c-abc8-40f5-97a3-8e75e2e12769/setup-container/0.log" Dec 03 09:15:02 crc kubenswrapper[4831]: I1203 09:15:02.762307 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_af0f187c-abc8-40f5-97a3-8e75e2e12769/rabbitmq/0.log" Dec 03 09:15:02 crc kubenswrapper[4831]: I1203 09:15:02.822206 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-openstack-openstack-cell1-c684f_bc45326f-d4eb-443f-babc-57f5bd7aa587/reboot-os-openstack-openstack-cell1/0.log" Dec 03 09:15:02 crc kubenswrapper[4831]: I1203 09:15:02.881366 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-openstack-openstack-cell1-fnq88_7eac7128-cee7-426e-83e3-7579b7744457/run-os-openstack-openstack-cell1/0.log" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.101337 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-openstack-lfwtg_d34dc8ad-d601-4df6-8bb8-4dc8d76e3935/ssh-known-hosts-openstack/0.log" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.396603 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-openstack-openstack-cell1-7z2tk_e35e9dbe-4290-4ad4-80b2-2672e5d6903f/telemetry-openstack-openstack-cell1/0.log" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.462035 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.518050 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07677260-9c2a-4dba-af59-a52e06364344-config-volume\") pod \"07677260-9c2a-4dba-af59-a52e06364344\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.518126 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tsrrl\" (UniqueName: \"kubernetes.io/projected/07677260-9c2a-4dba-af59-a52e06364344-kube-api-access-tsrrl\") pod \"07677260-9c2a-4dba-af59-a52e06364344\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.518146 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07677260-9c2a-4dba-af59-a52e06364344-secret-volume\") pod \"07677260-9c2a-4dba-af59-a52e06364344\" (UID: \"07677260-9c2a-4dba-af59-a52e06364344\") " Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.519959 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07677260-9c2a-4dba-af59-a52e06364344-config-volume" (OuterVolumeSpecName: "config-volume") pod "07677260-9c2a-4dba-af59-a52e06364344" (UID: "07677260-9c2a-4dba-af59-a52e06364344"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.524715 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07677260-9c2a-4dba-af59-a52e06364344-kube-api-access-tsrrl" (OuterVolumeSpecName: "kube-api-access-tsrrl") pod "07677260-9c2a-4dba-af59-a52e06364344" (UID: "07677260-9c2a-4dba-af59-a52e06364344"). 
InnerVolumeSpecName "kube-api-access-tsrrl". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.544975 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07677260-9c2a-4dba-af59-a52e06364344-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "07677260-9c2a-4dba-af59-a52e06364344" (UID: "07677260-9c2a-4dba-af59-a52e06364344"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.619759 4831 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07677260-9c2a-4dba-af59-a52e06364344-config-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.619787 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tsrrl\" (UniqueName: \"kubernetes.io/projected/07677260-9c2a-4dba-af59-a52e06364344-kube-api-access-tsrrl\") on node \"crc\" DevicePath \"\"" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.619800 4831 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/07677260-9c2a-4dba-af59-a52e06364344-secret-volume\") on node \"crc\" DevicePath \"\"" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.675229 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tripleo-cleanup-tripleo-cleanup-openstack-cell1-mg72t_307848db-fd00-472b-8653-c35696f43e6d/tripleo-cleanup-tripleo-cleanup-openstack-cell1/0.log" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.725096 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-openstack-openstack-cell1-mrmbq_4526405f-7aac-4d79-ad67-42f6a2b1f241/validate-network-openstack-openstack-cell1/0.log" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.929512 4831 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" event={"ID":"07677260-9c2a-4dba-af59-a52e06364344","Type":"ContainerDied","Data":"0ed342ca7e8270c9a48a81ae43f43d5237c58df5314b418c2dd5e62f7c7c7f04"} Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.929573 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29412555-cfjgw" Dec 03 09:15:03 crc kubenswrapper[4831]: I1203 09:15:03.929557 4831 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0ed342ca7e8270c9a48a81ae43f43d5237c58df5314b418c2dd5e62f7c7c7f04" Dec 03 09:15:04 crc kubenswrapper[4831]: I1203 09:15:04.539111 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz"] Dec 03 09:15:04 crc kubenswrapper[4831]: I1203 09:15:04.553463 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29412510-jlcjz"] Dec 03 09:15:05 crc kubenswrapper[4831]: I1203 09:15:05.025169 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="451128f9-59a1-4ac3-a52c-472c4b87c8c5" path="/var/lib/kubelet/pods/451128f9-59a1-4ac3-a52c-472c4b87c8c5/volumes" Dec 03 09:15:27 crc kubenswrapper[4831]: I1203 09:15:27.596895 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:15:27 crc kubenswrapper[4831]: I1203 09:15:27.597463 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:15:28 crc kubenswrapper[4831]: I1203 09:15:28.372606 4831 scope.go:117] "RemoveContainer" containerID="097f4e54c14f87e9d683188e1eff86121233a196fcec34264c153e33753110c3" Dec 03 09:15:28 crc kubenswrapper[4831]: I1203 09:15:28.406615 4831 scope.go:117] "RemoveContainer" containerID="3db66a56d102c8c488a606708f65976b7ac7a4369d1975f7b0467d1f366bfc1d" Dec 03 09:15:29 crc kubenswrapper[4831]: I1203 09:15:29.686958 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-9tvxr_dd91cf33-b91c-4430-ae22-ff8f52171f08/kube-rbac-proxy/0.log" Dec 03 09:15:29 crc kubenswrapper[4831]: I1203 09:15:29.840627 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7d9dfd778-9tvxr_dd91cf33-b91c-4430-ae22-ff8f52171f08/manager/0.log" Dec 03 09:15:29 crc kubenswrapper[4831]: I1203 09:15:29.893704 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-mcp6w_379fb9f5-e9c6-4362-b40a-c80ac7f58562/kube-rbac-proxy/0.log" Dec 03 09:15:29 crc kubenswrapper[4831]: I1203 09:15:29.970134 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-859b6ccc6-mcp6w_379fb9f5-e9c6-4362-b40a-c80ac7f58562/manager/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.039678 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n_dcfbcafe-4cef-4e71-a46f-e655a31beb6b/util/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.283428 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n_dcfbcafe-4cef-4e71-a46f-e655a31beb6b/util/0.log" Dec 03 
09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.284547 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n_dcfbcafe-4cef-4e71-a46f-e655a31beb6b/pull/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.290805 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n_dcfbcafe-4cef-4e71-a46f-e655a31beb6b/pull/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.541869 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n_dcfbcafe-4cef-4e71-a46f-e655a31beb6b/pull/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.542327 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n_dcfbcafe-4cef-4e71-a46f-e655a31beb6b/extract/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.608898 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_de27516925349f4302b839c8f653c877f0ae30e06d3090f097d88f57e7mpg4n_dcfbcafe-4cef-4e71-a46f-e655a31beb6b/util/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.755278 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-5lshd_61e7e997-91d1-4a49-8243-a0032d9ce077/kube-rbac-proxy/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.798027 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-78b4bc895b-5lshd_61e7e997-91d1-4a49-8243-a0032d9ce077/manager/0.log" Dec 03 09:15:30 crc kubenswrapper[4831]: I1203 09:15:30.889698 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-xgw7z_13a6a910-c42b-4ba4-85ad-62d932c41b4d/kube-rbac-proxy/0.log" Dec 03 09:15:31 crc kubenswrapper[4831]: I1203 09:15:31.104009 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-vqvxq_facee23f-2039-4bc2-84e2-c209c96f0812/kube-rbac-proxy/0.log" Dec 03 09:15:31 crc kubenswrapper[4831]: I1203 09:15:31.184798 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-77987cd8cd-xgw7z_13a6a910-c42b-4ba4-85ad-62d932c41b4d/manager/0.log" Dec 03 09:15:31 crc kubenswrapper[4831]: I1203 09:15:31.221084 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-5f64f6f8bb-vqvxq_facee23f-2039-4bc2-84e2-c209c96f0812/manager/0.log" Dec 03 09:15:31 crc kubenswrapper[4831]: I1203 09:15:31.814755 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-cvxcd_f4448da7-6edc-46ba-8a6c-d5491ddfc9a2/kube-rbac-proxy/0.log" Dec 03 09:15:31 crc kubenswrapper[4831]: I1203 09:15:31.893303 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-68c6d99b8f-cvxcd_f4448da7-6edc-46ba-8a6c-d5491ddfc9a2/manager/0.log" Dec 03 09:15:31 crc kubenswrapper[4831]: I1203 09:15:31.984900 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-bmk89_34580c97-5b51-43ab-affa-68c03a7c1d4d/kube-rbac-proxy/0.log" Dec 03 09:15:32 crc kubenswrapper[4831]: I1203 09:15:32.152341 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-l25t8_0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b/kube-rbac-proxy/0.log" Dec 03 09:15:32 crc kubenswrapper[4831]: I1203 
09:15:32.232131 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6c548fd776-l25t8_0566c3c5-2a1f-4fee-a3f0-ed880b1aba9b/manager/0.log" Dec 03 09:15:32 crc kubenswrapper[4831]: I1203 09:15:32.461580 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-57548d458d-bmk89_34580c97-5b51-43ab-affa-68c03a7c1d4d/manager/0.log" Dec 03 09:15:32 crc kubenswrapper[4831]: I1203 09:15:32.525474 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-xl7b2_6589826b-47ab-4f38-bfc6-e6d79787e272/kube-rbac-proxy/0.log" Dec 03 09:15:32 crc kubenswrapper[4831]: I1203 09:15:32.713559 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-7765d96ddf-xl7b2_6589826b-47ab-4f38-bfc6-e6d79787e272/manager/0.log" Dec 03 09:15:32 crc kubenswrapper[4831]: I1203 09:15:32.816696 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-k9882_a7a1c9f6-03de-405f-b50a-31494910f498/kube-rbac-proxy/0.log" Dec 03 09:15:32 crc kubenswrapper[4831]: I1203 09:15:32.892002 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-7c79b5df47-k9882_a7a1c9f6-03de-405f-b50a-31494910f498/manager/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.043870 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-dg285_f720b38f-39f1-4b9e-a6ee-268c76a855a0/kube-rbac-proxy/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.096839 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-56bbcc9d85-dg285_f720b38f-39f1-4b9e-a6ee-268c76a855a0/manager/0.log" Dec 03 
09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.191969 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-6wzs6_273bb4e9-067c-47e7-8ef0-973e2890ecb0/kube-rbac-proxy/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.293498 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-5fdfd5b6b5-6wzs6_273bb4e9-067c-47e7-8ef0-973e2890ecb0/manager/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.328787 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-4fxmh_17a7b8c5-b7a4-430e-b910-20d0c9a97dc1/kube-rbac-proxy/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.519658 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-cdbrr_aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2/kube-rbac-proxy/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.648373 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn_1a45949e-adca-4398-82a3-a0d25c8f9702/kube-rbac-proxy/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.648586 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-697bc559fc-4fxmh_17a7b8c5-b7a4-430e-b910-20d0c9a97dc1/manager/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.649388 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-998648c74-cdbrr_aa93d4b6-18b8-43c4-a1d2-e6c95e4eebf2/manager/0.log" Dec 03 09:15:33 crc kubenswrapper[4831]: I1203 09:15:33.702721 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-64bc77cfd4qttdn_1a45949e-adca-4398-82a3-a0d25c8f9702/manager/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.036309 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-operator-5b4678cf94-jj86g_3ed824d8-b8a7-4ada-9d92-a8d58a5e91b1/operator/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.079052 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-k24lq_f32bda31-93c7-4b6e-af11-0db32110019e/registry-server/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.182077 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-kx5qm_4628f220-2e57-479f-b91a-9dea443d3456/kube-rbac-proxy/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.364322 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-vg5hb_3c0f7e03-1610-4c34-824f-c6b7ad6310ea/kube-rbac-proxy/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.440419 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-b6456fdb6-kx5qm_4628f220-2e57-479f-b91a-9dea443d3456/manager/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.463270 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-78f8948974-vg5hb_3c0f7e03-1610-4c34-824f-c6b7ad6310ea/manager/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.633040 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-wmfmc_0c0294b0-7070-44b3-adc3-63f6cae3992c/operator/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.744306 4831 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-v9ptz_45d17cba-18de-45cc-8561-dd1d50b9061a/kube-rbac-proxy/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.828074 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-5f8c65bbfc-v9ptz_45d17cba-18de-45cc-8561-dd1d50b9061a/manager/0.log" Dec 03 09:15:34 crc kubenswrapper[4831]: I1203 09:15:34.914048 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-txlnp_44a494a8-2cda-4092-9510-41314e5f93c8/kube-rbac-proxy/0.log" Dec 03 09:15:35 crc kubenswrapper[4831]: I1203 09:15:35.161126 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-4sh4s_8ce51182-8548-461c-a6b9-1dae9a549221/kube-rbac-proxy/0.log" Dec 03 09:15:35 crc kubenswrapper[4831]: I1203 09:15:35.163094 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5854674fcc-4sh4s_8ce51182-8548-461c-a6b9-1dae9a549221/manager/0.log" Dec 03 09:15:35 crc kubenswrapper[4831]: I1203 09:15:35.267648 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-76cc84c6bb-txlnp_44a494a8-2cda-4092-9510-41314e5f93c8/manager/0.log" Dec 03 09:15:35 crc kubenswrapper[4831]: I1203 09:15:35.361099 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-2tgr7_c1bc052b-bcf5-43a3-a84b-7ea23a95f18a/manager/0.log" Dec 03 09:15:35 crc kubenswrapper[4831]: I1203 09:15:35.396920 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-769dc69bc-2tgr7_c1bc052b-bcf5-43a3-a84b-7ea23a95f18a/kube-rbac-proxy/0.log" Dec 03 09:15:36 crc 
kubenswrapper[4831]: I1203 09:15:36.072887 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-5586f6bb8b-ps5jg_8a2d00a9-7a0c-45d2-8f1d-080748f8366b/manager/0.log" Dec 03 09:15:55 crc kubenswrapper[4831]: I1203 09:15:55.952415 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-d8p7m_51eb384f-9a82-4a1b-ab8d-749a23376b2f/control-plane-machine-set-operator/0.log" Dec 03 09:15:56 crc kubenswrapper[4831]: I1203 09:15:56.107144 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7pqsq_0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6/machine-api-operator/0.log" Dec 03 09:15:56 crc kubenswrapper[4831]: I1203 09:15:56.119271 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7pqsq_0f57f1a1-1ccf-4fda-be1e-dcbd41f2e0f6/kube-rbac-proxy/0.log" Dec 03 09:15:57 crc kubenswrapper[4831]: I1203 09:15:57.597114 4831 patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:15:57 crc kubenswrapper[4831]: I1203 09:15:57.597168 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:16:09 crc kubenswrapper[4831]: I1203 09:16:09.144219 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-ztn66_28530532-eea3-4d88-9615-e7525130ea89/cert-manager-controller/0.log" 
Dec 03 09:16:09 crc kubenswrapper[4831]: I1203 09:16:09.303722 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-gkk6c_4bbcf00b-ec72-4ee0-a9ff-683a7ffe476b/cert-manager-cainjector/0.log" Dec 03 09:16:09 crc kubenswrapper[4831]: I1203 09:16:09.344860 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-rs4hm_2334e9c7-0d74-4b40-bfa2-42916f53c7aa/cert-manager-webhook/0.log" Dec 03 09:16:22 crc kubenswrapper[4831]: I1203 09:16:22.288508 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7fbb5f6569-4h5zz_9b13740f-a5e5-40c1-8925-12aaa3a9498c/nmstate-console-plugin/0.log" Dec 03 09:16:22 crc kubenswrapper[4831]: I1203 09:16:22.535095 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-m2cm2_328e107f-bb1a-448c-92c0-7aacaa6bb84f/kube-rbac-proxy/0.log" Dec 03 09:16:22 crc kubenswrapper[4831]: I1203 09:16:22.592169 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-frsqc_a85480d7-0191-4b3b-8542-ee01a494109f/nmstate-handler/0.log" Dec 03 09:16:22 crc kubenswrapper[4831]: I1203 09:16:22.662005 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-7f946cbc9-m2cm2_328e107f-bb1a-448c-92c0-7aacaa6bb84f/nmstate-metrics/0.log" Dec 03 09:16:22 crc kubenswrapper[4831]: I1203 09:16:22.817478 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-5b5b58f5c8-j4lt4_52e0691e-6d8e-473a-84d0-11d5872313d7/nmstate-operator/0.log" Dec 03 09:16:22 crc kubenswrapper[4831]: I1203 09:16:22.896971 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f6d4c5ccb-gl6ns_123ef7d5-4ad6-4a82-8dc7-63621e57d51c/nmstate-webhook/0.log" Dec 03 09:16:27 crc kubenswrapper[4831]: I1203 09:16:27.596487 4831 
patch_prober.go:28] interesting pod/machine-config-daemon-dvcq5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Dec 03 09:16:27 crc kubenswrapper[4831]: I1203 09:16:27.597252 4831 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Dec 03 09:16:27 crc kubenswrapper[4831]: I1203 09:16:27.597337 4831 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" Dec 03 09:16:27 crc kubenswrapper[4831]: I1203 09:16:27.599064 4831 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"} pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Dec 03 09:16:27 crc kubenswrapper[4831]: I1203 09:16:27.599304 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" containerName="machine-config-daemon" containerID="cri-o://12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf" gracePeriod=600 Dec 03 09:16:28 crc kubenswrapper[4831]: E1203 09:16:28.160276 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:16:29 crc kubenswrapper[4831]: I1203 09:16:29.007235 4831 generic.go:334] "Generic (PLEG): container finished" podID="4e04caf2-8e18-4af8-9779-c5711262077b" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf" exitCode=0 Dec 03 09:16:29 crc kubenswrapper[4831]: I1203 09:16:29.007295 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerDied","Data":"12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"} Dec 03 09:16:29 crc kubenswrapper[4831]: I1203 09:16:29.007566 4831 scope.go:117] "RemoveContainer" containerID="fa4eb3500b4af6b5ec70d42d28664dd92eb6006e7417e0949f20cc87d4e6794c" Dec 03 09:16:29 crc kubenswrapper[4831]: I1203 09:16:29.008298 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf" Dec 03 09:16:29 crc kubenswrapper[4831]: E1203 09:16:29.008789 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.161738 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-8hrcc"] Dec 03 09:16:38 crc kubenswrapper[4831]: E1203 09:16:38.162641 4831 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="07677260-9c2a-4dba-af59-a52e06364344" containerName="collect-profiles" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.162655 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="07677260-9c2a-4dba-af59-a52e06364344" containerName="collect-profiles" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.162878 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="07677260-9c2a-4dba-af59-a52e06364344" containerName="collect-profiles" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.164454 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.187302 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hrcc"] Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.312578 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnr79\" (UniqueName: \"kubernetes.io/projected/cd10f087-28e1-4c0f-a90b-511b577b7c14-kube-api-access-vnr79\") pod \"certified-operators-8hrcc\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.312648 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-catalog-content\") pod \"certified-operators-8hrcc\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.312864 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-utilities\") pod \"certified-operators-8hrcc\" (UID: 
\"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.414228 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-utilities\") pod \"certified-operators-8hrcc\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.414338 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnr79\" (UniqueName: \"kubernetes.io/projected/cd10f087-28e1-4c0f-a90b-511b577b7c14-kube-api-access-vnr79\") pod \"certified-operators-8hrcc\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.414374 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-catalog-content\") pod \"certified-operators-8hrcc\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.414821 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-catalog-content\") pod \"certified-operators-8hrcc\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.414813 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-utilities\") pod \"certified-operators-8hrcc\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") 
" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.461093 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnr79\" (UniqueName: \"kubernetes.io/projected/cd10f087-28e1-4c0f-a90b-511b577b7c14-kube-api-access-vnr79\") pod \"certified-operators-8hrcc\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:38 crc kubenswrapper[4831]: I1203 09:16:38.491176 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:39 crc kubenswrapper[4831]: I1203 09:16:39.068920 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-8hrcc"] Dec 03 09:16:39 crc kubenswrapper[4831]: I1203 09:16:39.133025 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hrcc" event={"ID":"cd10f087-28e1-4c0f-a90b-511b577b7c14","Type":"ContainerStarted","Data":"6497fc9f53346d8e12bc8abfc0c2614dd87f552808b2f0ebfa28626622d9fac9"} Dec 03 09:16:40 crc kubenswrapper[4831]: I1203 09:16:40.146565 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hrcc" event={"ID":"cd10f087-28e1-4c0f-a90b-511b577b7c14","Type":"ContainerStarted","Data":"820dd4780b13677149b00681ed10d0ac4b97f2a8423d63d5d551b53b939cd2d6"} Dec 03 09:16:41 crc kubenswrapper[4831]: I1203 09:16:41.013220 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf" Dec 03 09:16:41 crc kubenswrapper[4831]: E1203 09:16:41.014187 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:16:41 crc kubenswrapper[4831]: I1203 09:16:41.102297 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vwlgn_6e184be1-0196-438c-a4ed-05ee32ccac09/kube-rbac-proxy/0.log" Dec 03 09:16:41 crc kubenswrapper[4831]: I1203 09:16:41.156542 4831 generic.go:334] "Generic (PLEG): container finished" podID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerID="820dd4780b13677149b00681ed10d0ac4b97f2a8423d63d5d551b53b939cd2d6" exitCode=0 Dec 03 09:16:41 crc kubenswrapper[4831]: I1203 09:16:41.156642 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hrcc" event={"ID":"cd10f087-28e1-4c0f-a90b-511b577b7c14","Type":"ContainerDied","Data":"820dd4780b13677149b00681ed10d0ac4b97f2a8423d63d5d551b53b939cd2d6"} Dec 03 09:16:41 crc kubenswrapper[4831]: I1203 09:16:41.159024 4831 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Dec 03 09:16:41 crc kubenswrapper[4831]: I1203 09:16:41.398608 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-frr-files/0.log" Dec 03 09:16:41 crc kubenswrapper[4831]: I1203 09:16:41.605114 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-f8648f98b-vwlgn_6e184be1-0196-438c-a4ed-05ee32ccac09/controller/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.212336 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-reloader/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.244511 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-frr-files/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.264067 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-metrics/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.344291 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-reloader/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.517013 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-reloader/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.549893 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-frr-files/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.579425 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-metrics/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.618399 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-metrics/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.783232 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-frr-files/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.829557 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-reloader/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.841970 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/cp-metrics/0.log" Dec 03 09:16:42 crc kubenswrapper[4831]: I1203 09:16:42.914728 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/controller/0.log" Dec 03 09:16:43 crc kubenswrapper[4831]: I1203 09:16:43.110515 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/kube-rbac-proxy/0.log" Dec 03 09:16:43 crc kubenswrapper[4831]: I1203 09:16:43.114536 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/frr-metrics/0.log" Dec 03 09:16:43 crc kubenswrapper[4831]: I1203 09:16:43.195519 4831 generic.go:334] "Generic (PLEG): container finished" podID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerID="185998197d9479d98a9a2c511078aa31c9b9dd84604ec9182b84da77073a4f72" exitCode=0 Dec 03 09:16:43 crc kubenswrapper[4831]: I1203 09:16:43.195835 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hrcc" event={"ID":"cd10f087-28e1-4c0f-a90b-511b577b7c14","Type":"ContainerDied","Data":"185998197d9479d98a9a2c511078aa31c9b9dd84604ec9182b84da77073a4f72"} Dec 03 09:16:43 crc kubenswrapper[4831]: I1203 09:16:43.196124 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/kube-rbac-proxy-frr/0.log" Dec 03 09:16:43 crc kubenswrapper[4831]: I1203 09:16:43.988336 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7fcb986d4-kvrm8_8ce7e180-7f81-4cb1-b046-7e53111c2731/frr-k8s-webhook-server/0.log" Dec 03 09:16:43 crc kubenswrapper[4831]: I1203 09:16:43.990958 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/reloader/0.log" 
Dec 03 09:16:44 crc kubenswrapper[4831]: I1203 09:16:44.293025 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5bbcb6bcff-p6plt_ae73146a-b079-4641-942b-9ceebbfbae34/webhook-server/0.log" Dec 03 09:16:44 crc kubenswrapper[4831]: I1203 09:16:44.391177 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-8666f48b7d-c4vmb_e149f20c-2288-4fcf-b90a-f2bb4029436d/manager/0.log" Dec 03 09:16:44 crc kubenswrapper[4831]: I1203 09:16:44.536553 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-flnpv_4ed12823-b3b1-4ee5-af2e-07320e5421eb/kube-rbac-proxy/0.log" Dec 03 09:16:45 crc kubenswrapper[4831]: I1203 09:16:45.223529 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hrcc" event={"ID":"cd10f087-28e1-4c0f-a90b-511b577b7c14","Type":"ContainerStarted","Data":"85dd7556290936b07a0bed3bf296fa73a1fa0948702df713252d713e320c9002"} Dec 03 09:16:45 crc kubenswrapper[4831]: I1203 09:16:45.245798 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-8hrcc" podStartSLOduration=4.352695103 podStartE2EDuration="7.245780805s" podCreationTimestamp="2025-12-03 09:16:38 +0000 UTC" firstStartedPulling="2025-12-03 09:16:41.158735914 +0000 UTC m=+9938.502319432" lastFinishedPulling="2025-12-03 09:16:44.051821636 +0000 UTC m=+9941.395405134" observedRunningTime="2025-12-03 09:16:45.244872226 +0000 UTC m=+9942.588455734" watchObservedRunningTime="2025-12-03 09:16:45.245780805 +0000 UTC m=+9942.589364313" Dec 03 09:16:45 crc kubenswrapper[4831]: I1203 09:16:45.465221 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-flnpv_4ed12823-b3b1-4ee5-af2e-07320e5421eb/speaker/0.log" Dec 03 09:16:46 crc kubenswrapper[4831]: I1203 09:16:46.998868 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-49hk4_b023523a-cf93-48a2-be02-a6f4ba831bca/frr/0.log" Dec 03 09:16:48 crc kubenswrapper[4831]: I1203 09:16:48.492145 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:48 crc kubenswrapper[4831]: I1203 09:16:48.492536 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:48 crc kubenswrapper[4831]: I1203 09:16:48.541014 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:49 crc kubenswrapper[4831]: I1203 09:16:49.340128 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:49 crc kubenswrapper[4831]: I1203 09:16:49.411626 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hrcc"] Dec 03 09:16:51 crc kubenswrapper[4831]: I1203 09:16:51.291933 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-8hrcc" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerName="registry-server" containerID="cri-o://85dd7556290936b07a0bed3bf296fa73a1fa0948702df713252d713e320c9002" gracePeriod=2 Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.309468 4831 generic.go:334] "Generic (PLEG): container finished" podID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerID="85dd7556290936b07a0bed3bf296fa73a1fa0948702df713252d713e320c9002" exitCode=0 Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.309541 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hrcc" event={"ID":"cd10f087-28e1-4c0f-a90b-511b577b7c14","Type":"ContainerDied","Data":"85dd7556290936b07a0bed3bf296fa73a1fa0948702df713252d713e320c9002"} Dec 03 09:16:52 crc 
kubenswrapper[4831]: I1203 09:16:52.540176 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.665962 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnr79\" (UniqueName: \"kubernetes.io/projected/cd10f087-28e1-4c0f-a90b-511b577b7c14-kube-api-access-vnr79\") pod \"cd10f087-28e1-4c0f-a90b-511b577b7c14\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.666370 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-catalog-content\") pod \"cd10f087-28e1-4c0f-a90b-511b577b7c14\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.666844 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-utilities\") pod \"cd10f087-28e1-4c0f-a90b-511b577b7c14\" (UID: \"cd10f087-28e1-4c0f-a90b-511b577b7c14\") " Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.667550 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-utilities" (OuterVolumeSpecName: "utilities") pod "cd10f087-28e1-4c0f-a90b-511b577b7c14" (UID: "cd10f087-28e1-4c0f-a90b-511b577b7c14"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.667837 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.719507 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd10f087-28e1-4c0f-a90b-511b577b7c14-kube-api-access-vnr79" (OuterVolumeSpecName: "kube-api-access-vnr79") pod "cd10f087-28e1-4c0f-a90b-511b577b7c14" (UID: "cd10f087-28e1-4c0f-a90b-511b577b7c14"). InnerVolumeSpecName "kube-api-access-vnr79". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.738772 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cd10f087-28e1-4c0f-a90b-511b577b7c14" (UID: "cd10f087-28e1-4c0f-a90b-511b577b7c14"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.771241 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnr79\" (UniqueName: \"kubernetes.io/projected/cd10f087-28e1-4c0f-a90b-511b577b7c14-kube-api-access-vnr79\") on node \"crc\" DevicePath \"\"" Dec 03 09:16:52 crc kubenswrapper[4831]: I1203 09:16:52.771301 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cd10f087-28e1-4c0f-a90b-511b577b7c14-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:16:53 crc kubenswrapper[4831]: I1203 09:16:53.027884 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf" Dec 03 09:16:53 crc kubenswrapper[4831]: E1203 09:16:53.028812 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:16:53 crc kubenswrapper[4831]: I1203 09:16:53.326198 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-8hrcc" event={"ID":"cd10f087-28e1-4c0f-a90b-511b577b7c14","Type":"ContainerDied","Data":"6497fc9f53346d8e12bc8abfc0c2614dd87f552808b2f0ebfa28626622d9fac9"} Dec 03 09:16:53 crc kubenswrapper[4831]: I1203 09:16:53.326256 4831 scope.go:117] "RemoveContainer" containerID="85dd7556290936b07a0bed3bf296fa73a1fa0948702df713252d713e320c9002" Dec 03 09:16:53 crc kubenswrapper[4831]: I1203 09:16:53.326404 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-8hrcc" Dec 03 09:16:53 crc kubenswrapper[4831]: I1203 09:16:53.352958 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-8hrcc"] Dec 03 09:16:53 crc kubenswrapper[4831]: I1203 09:16:53.364399 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-8hrcc"] Dec 03 09:16:53 crc kubenswrapper[4831]: I1203 09:16:53.372917 4831 scope.go:117] "RemoveContainer" containerID="185998197d9479d98a9a2c511078aa31c9b9dd84604ec9182b84da77073a4f72" Dec 03 09:16:53 crc kubenswrapper[4831]: I1203 09:16:53.395541 4831 scope.go:117] "RemoveContainer" containerID="820dd4780b13677149b00681ed10d0ac4b97f2a8423d63d5d551b53b939cd2d6" Dec 03 09:16:55 crc kubenswrapper[4831]: I1203 09:16:55.026870 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" path="/var/lib/kubelet/pods/cd10f087-28e1-4c0f-a90b-511b577b7c14/volumes" Dec 03 09:16:59 crc kubenswrapper[4831]: I1203 09:16:59.226098 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd_c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4/util/0.log" Dec 03 09:16:59 crc kubenswrapper[4831]: I1203 09:16:59.481794 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd_c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4/util/0.log" Dec 03 09:16:59 crc kubenswrapper[4831]: I1203 09:16:59.495040 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd_c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4/pull/0.log" Dec 03 09:16:59 crc kubenswrapper[4831]: I1203 09:16:59.500240 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd_c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4/pull/0.log" Dec 03 09:16:59 crc kubenswrapper[4831]: I1203 09:16:59.665258 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd_c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4/util/0.log" Dec 03 09:16:59 crc kubenswrapper[4831]: I1203 09:16:59.691677 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd_c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4/pull/0.log" Dec 03 09:16:59 crc kubenswrapper[4831]: I1203 09:16:59.710389 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931aqbggd_c7ea5269-a0d1-4074-a27b-ff29b0dd0ec4/extract/0.log" Dec 03 09:16:59 crc kubenswrapper[4831]: I1203 09:16:59.844002 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf_62af8b2f-d7af-47ca-9111-1e4fc68aaf8f/util/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.005447 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf_62af8b2f-d7af-47ca-9111-1e4fc68aaf8f/pull/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.022803 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf_62af8b2f-d7af-47ca-9111-1e4fc68aaf8f/util/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.057053 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf_62af8b2f-d7af-47ca-9111-1e4fc68aaf8f/pull/0.log" Dec 03 
09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.212581 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf_62af8b2f-d7af-47ca-9111-1e4fc68aaf8f/util/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.220739 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf_62af8b2f-d7af-47ca-9111-1e4fc68aaf8f/pull/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.222198 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_5064f9f8917b246f69f5d7fc025e7e6c34236c02bca31167615d38212fv6lmf_62af8b2f-d7af-47ca-9111-1e4fc68aaf8f/extract/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.400981 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2_f064abcd-d3d7-4a44-a224-6fde3a142406/util/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.588914 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2_f064abcd-d3d7-4a44-a224-6fde3a142406/util/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.597405 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2_f064abcd-d3d7-4a44-a224-6fde3a142406/pull/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.627882 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2_f064abcd-d3d7-4a44-a224-6fde3a142406/pull/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.785637 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2_f064abcd-d3d7-4a44-a224-6fde3a142406/util/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.788169 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2_f064abcd-d3d7-4a44-a224-6fde3a142406/extract/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.820252 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6c372a8d094fad7255d3bbeabb4914bd2356af7b203a2d2176be1c92102k8f2_f064abcd-d3d7-4a44-a224-6fde3a142406/pull/0.log" Dec 03 09:17:00 crc kubenswrapper[4831]: I1203 09:17:00.993217 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj_6be5b407-422d-4d4c-8897-d2477b1c1ae1/util/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.169117 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj_6be5b407-422d-4d4c-8897-d2477b1c1ae1/util/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.177016 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj_6be5b407-422d-4d4c-8897-d2477b1c1ae1/pull/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.177435 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj_6be5b407-422d-4d4c-8897-d2477b1c1ae1/pull/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.409605 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj_6be5b407-422d-4d4c-8897-d2477b1c1ae1/util/0.log" Dec 03 
09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.414354 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj_6be5b407-422d-4d4c-8897-d2477b1c1ae1/pull/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.447054 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_af69698b82fdf008f5ff9e195cf808a654240e16b13dcd924b74994f832rknj_6be5b407-422d-4d4c-8897-d2477b1c1ae1/extract/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.574141 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vtksj_6df1e16e-4331-43c5-94f9-73d6ad45157b/extract-utilities/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.823337 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vtksj_6df1e16e-4331-43c5-94f9-73d6ad45157b/extract-content/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.833566 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vtksj_6df1e16e-4331-43c5-94f9-73d6ad45157b/extract-utilities/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.851974 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vtksj_6df1e16e-4331-43c5-94f9-73d6ad45157b/extract-content/0.log" Dec 03 09:17:01 crc kubenswrapper[4831]: I1203 09:17:01.999529 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vtksj_6df1e16e-4331-43c5-94f9-73d6ad45157b/extract-utilities/0.log" Dec 03 09:17:02 crc kubenswrapper[4831]: I1203 09:17:02.097597 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vtksj_6df1e16e-4331-43c5-94f9-73d6ad45157b/extract-content/0.log" Dec 03 09:17:02 crc kubenswrapper[4831]: I1203 09:17:02.271719 
4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ws8r_d9119e77-c547-4584-98b5-44c99988a1c0/extract-utilities/0.log" Dec 03 09:17:02 crc kubenswrapper[4831]: I1203 09:17:02.518943 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ws8r_d9119e77-c547-4584-98b5-44c99988a1c0/extract-content/0.log" Dec 03 09:17:02 crc kubenswrapper[4831]: I1203 09:17:02.569602 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ws8r_d9119e77-c547-4584-98b5-44c99988a1c0/extract-content/0.log" Dec 03 09:17:02 crc kubenswrapper[4831]: I1203 09:17:02.629599 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ws8r_d9119e77-c547-4584-98b5-44c99988a1c0/extract-utilities/0.log" Dec 03 09:17:02 crc kubenswrapper[4831]: I1203 09:17:02.695974 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-vtksj_6df1e16e-4331-43c5-94f9-73d6ad45157b/registry-server/0.log" Dec 03 09:17:03 crc kubenswrapper[4831]: I1203 09:17:03.515176 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ws8r_d9119e77-c547-4584-98b5-44c99988a1c0/extract-content/0.log" Dec 03 09:17:03 crc kubenswrapper[4831]: I1203 09:17:03.541423 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-j22fq_5c596f0f-729f-4beb-b1f7-58ce65c9a928/marketplace-operator/0.log" Dec 03 09:17:03 crc kubenswrapper[4831]: I1203 09:17:03.613147 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ws8r_d9119e77-c547-4584-98b5-44c99988a1c0/extract-utilities/0.log" Dec 03 09:17:03 crc kubenswrapper[4831]: I1203 09:17:03.775700 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-njc9k_4139c162-640d-47e5-878f-b6c3835bd31d/extract-utilities/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.016060 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-njc9k_4139c162-640d-47e5-878f-b6c3835bd31d/extract-content/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.069891 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-njc9k_4139c162-640d-47e5-878f-b6c3835bd31d/extract-content/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.071539 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-njc9k_4139c162-640d-47e5-878f-b6c3835bd31d/extract-utilities/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.202450 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-njc9k_4139c162-640d-47e5-878f-b6c3835bd31d/extract-utilities/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.247820 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-njc9k_4139c162-640d-47e5-878f-b6c3835bd31d/extract-content/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.405717 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6ws8r_d9119e77-c547-4584-98b5-44c99988a1c0/registry-server/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.427614 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zsw26_eff1e6c4-5941-48cc-a498-1dbd6c977a9a/extract-utilities/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.709812 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-njc9k_4139c162-640d-47e5-878f-b6c3835bd31d/registry-server/0.log" Dec 03 09:17:04 crc kubenswrapper[4831]: I1203 09:17:04.993015 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zsw26_eff1e6c4-5941-48cc-a498-1dbd6c977a9a/extract-content/0.log" Dec 03 09:17:05 crc kubenswrapper[4831]: I1203 09:17:05.004913 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zsw26_eff1e6c4-5941-48cc-a498-1dbd6c977a9a/extract-content/0.log" Dec 03 09:17:05 crc kubenswrapper[4831]: I1203 09:17:05.013266 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf" Dec 03 09:17:05 crc kubenswrapper[4831]: E1203 09:17:05.013702 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:17:05 crc kubenswrapper[4831]: I1203 09:17:05.049137 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zsw26_eff1e6c4-5941-48cc-a498-1dbd6c977a9a/extract-utilities/0.log" Dec 03 09:17:05 crc kubenswrapper[4831]: I1203 09:17:05.190001 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zsw26_eff1e6c4-5941-48cc-a498-1dbd6c977a9a/extract-utilities/0.log" Dec 03 09:17:05 crc kubenswrapper[4831]: I1203 09:17:05.202702 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zsw26_eff1e6c4-5941-48cc-a498-1dbd6c977a9a/extract-content/0.log" Dec 03 09:17:05 crc 
kubenswrapper[4831]: I1203 09:17:05.290891 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-zsw26_eff1e6c4-5941-48cc-a498-1dbd6c977a9a/registry-server/0.log" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.346526 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-n54lx"] Dec 03 09:17:10 crc kubenswrapper[4831]: E1203 09:17:10.347712 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerName="extract-utilities" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.347733 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerName="extract-utilities" Dec 03 09:17:10 crc kubenswrapper[4831]: E1203 09:17:10.347746 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerName="extract-content" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.347755 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerName="extract-content" Dec 03 09:17:10 crc kubenswrapper[4831]: E1203 09:17:10.347789 4831 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerName="registry-server" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.347798 4831 state_mem.go:107] "Deleted CPUSet assignment" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerName="registry-server" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.348089 4831 memory_manager.go:354] "RemoveStaleState removing state" podUID="cd10f087-28e1-4c0f-a90b-511b577b7c14" containerName="registry-server" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.350045 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.360802 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n54lx"] Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.511765 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-utilities\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.512174 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktvgq\" (UniqueName: \"kubernetes.io/projected/441da6e5-70c4-49b1-9c8c-5c652657d074-kube-api-access-ktvgq\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.512588 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-catalog-content\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.613421 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-catalog-content\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.613528 4831 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-utilities\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.613568 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktvgq\" (UniqueName: \"kubernetes.io/projected/441da6e5-70c4-49b1-9c8c-5c652657d074-kube-api-access-ktvgq\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.614072 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-utilities\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.614072 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-catalog-content\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.639703 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktvgq\" (UniqueName: \"kubernetes.io/projected/441da6e5-70c4-49b1-9c8c-5c652657d074-kube-api-access-ktvgq\") pod \"community-operators-n54lx\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:10 crc kubenswrapper[4831]: I1203 09:17:10.690037 4831 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:11 crc kubenswrapper[4831]: I1203 09:17:11.254330 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-n54lx"] Dec 03 09:17:11 crc kubenswrapper[4831]: I1203 09:17:11.509931 4831 generic.go:334] "Generic (PLEG): container finished" podID="441da6e5-70c4-49b1-9c8c-5c652657d074" containerID="a06dde41308244717eecf469415377b2bb0354214d3a748b38b21eba814037fd" exitCode=0 Dec 03 09:17:11 crc kubenswrapper[4831]: I1203 09:17:11.509973 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n54lx" event={"ID":"441da6e5-70c4-49b1-9c8c-5c652657d074","Type":"ContainerDied","Data":"a06dde41308244717eecf469415377b2bb0354214d3a748b38b21eba814037fd"} Dec 03 09:17:11 crc kubenswrapper[4831]: I1203 09:17:11.509995 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n54lx" event={"ID":"441da6e5-70c4-49b1-9c8c-5c652657d074","Type":"ContainerStarted","Data":"047dd942f09acad566a4868c5b3f4b2474b36cecf9c6256fc082fcd3b81ac57f"} Dec 03 09:17:11 crc kubenswrapper[4831]: E1203 09:17:11.765289 4831 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441da6e5_70c4_49b1_9c8c_5c652657d074.slice/crio-conmon-a06dde41308244717eecf469415377b2bb0354214d3a748b38b21eba814037fd.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod441da6e5_70c4_49b1_9c8c_5c652657d074.slice/crio-a06dde41308244717eecf469415377b2bb0354214d3a748b38b21eba814037fd.scope\": RecentStats: unable to find data in memory cache]" Dec 03 09:17:13 crc kubenswrapper[4831]: I1203 09:17:13.531938 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n54lx" 
event={"ID":"441da6e5-70c4-49b1-9c8c-5c652657d074","Type":"ContainerStarted","Data":"6350551b17e90d8e15ead336d185010c4599a4373c63fc9f8e88efa35f231bae"} Dec 03 09:17:14 crc kubenswrapper[4831]: I1203 09:17:14.543789 4831 generic.go:334] "Generic (PLEG): container finished" podID="441da6e5-70c4-49b1-9c8c-5c652657d074" containerID="6350551b17e90d8e15ead336d185010c4599a4373c63fc9f8e88efa35f231bae" exitCode=0 Dec 03 09:17:14 crc kubenswrapper[4831]: I1203 09:17:14.543906 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n54lx" event={"ID":"441da6e5-70c4-49b1-9c8c-5c652657d074","Type":"ContainerDied","Data":"6350551b17e90d8e15ead336d185010c4599a4373c63fc9f8e88efa35f231bae"} Dec 03 09:17:15 crc kubenswrapper[4831]: I1203 09:17:15.557824 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n54lx" event={"ID":"441da6e5-70c4-49b1-9c8c-5c652657d074","Type":"ContainerStarted","Data":"1000d54136dde4a85cabd04c6fc90bdb04b0de7e1f5159da23953aa496f5dd69"} Dec 03 09:17:15 crc kubenswrapper[4831]: I1203 09:17:15.581414 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-n54lx" podStartSLOduration=2.155362545 podStartE2EDuration="5.581393599s" podCreationTimestamp="2025-12-03 09:17:10 +0000 UTC" firstStartedPulling="2025-12-03 09:17:11.512124671 +0000 UTC m=+9968.855708179" lastFinishedPulling="2025-12-03 09:17:14.938155725 +0000 UTC m=+9972.281739233" observedRunningTime="2025-12-03 09:17:15.575878038 +0000 UTC m=+9972.919461536" watchObservedRunningTime="2025-12-03 09:17:15.581393599 +0000 UTC m=+9972.924977107" Dec 03 09:17:17 crc kubenswrapper[4831]: I1203 09:17:17.013260 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf" Dec 03 09:17:17 crc kubenswrapper[4831]: E1203 09:17:17.013977 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed 
to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:17:18 crc kubenswrapper[4831]: I1203 09:17:18.938864 4831 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-fv869"] Dec 03 09:17:18 crc kubenswrapper[4831]: I1203 09:17:18.941974 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:18 crc kubenswrapper[4831]: I1203 09:17:18.971792 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv869"] Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.131154 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-utilities\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.131249 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtk9\" (UniqueName: \"kubernetes.io/projected/c95e37c3-4722-495f-8dc4-5c6585e24c24-kube-api-access-vgtk9\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.131397 4831 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-catalog-content\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.233111 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-utilities\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.233228 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtk9\" (UniqueName: \"kubernetes.io/projected/c95e37c3-4722-495f-8dc4-5c6585e24c24-kube-api-access-vgtk9\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.233388 4831 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-catalog-content\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.233620 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-utilities\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.234120 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-catalog-content\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.256591 4831 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtk9\" (UniqueName: \"kubernetes.io/projected/c95e37c3-4722-495f-8dc4-5c6585e24c24-kube-api-access-vgtk9\") pod \"redhat-marketplace-fv869\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.261985 4831 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:19 crc kubenswrapper[4831]: I1203 09:17:19.806472 4831 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv869"] Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.068929 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-668cf9dfbb-89ql6_ae87822e-4b31-43cb-af6e-33739656a430/prometheus-operator/0.log" Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.151839 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57c7c77f4f-n2jvx_ad3a46cd-b6a3-4229-8d68-9d9931dc33bf/prometheus-operator-admission-webhook/0.log" Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.258357 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-57c7c77f4f-svvqp_7d153656-c76f-46c6-a2c1-51e507cd8705/prometheus-operator-admission-webhook/0.log" Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.369167 4831 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operators_observability-operator-d8bb48f5d-tchlg_c5828415-c585-469e-8596-3c7142eb299a/operator/0.log" Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.439340 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5446b9c989-c7d8p_930ff1fd-f482-47a3-a52a-09970ac40b24/perses-operator/0.log" Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.614538 4831 generic.go:334] "Generic (PLEG): container finished" podID="c95e37c3-4722-495f-8dc4-5c6585e24c24" containerID="942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0" exitCode=0 Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.614629 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv869" event={"ID":"c95e37c3-4722-495f-8dc4-5c6585e24c24","Type":"ContainerDied","Data":"942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0"} Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.614891 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv869" event={"ID":"c95e37c3-4722-495f-8dc4-5c6585e24c24","Type":"ContainerStarted","Data":"b1601ebdc1cc71bc8fe0e7b81b00a5c62b2d4e621ab2f91d9cb5a025baa2fc9c"} Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.706304 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.709406 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:20 crc kubenswrapper[4831]: I1203 09:17:20.775672 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:21 crc kubenswrapper[4831]: I1203 09:17:21.680421 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:22 crc kubenswrapper[4831]: I1203 09:17:22.645659 4831 generic.go:334] "Generic (PLEG): container finished" podID="c95e37c3-4722-495f-8dc4-5c6585e24c24" containerID="95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3" exitCode=0 Dec 03 09:17:22 crc kubenswrapper[4831]: I1203 09:17:22.645732 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv869" event={"ID":"c95e37c3-4722-495f-8dc4-5c6585e24c24","Type":"ContainerDied","Data":"95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3"} Dec 03 09:17:23 crc kubenswrapper[4831]: I1203 09:17:23.110078 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n54lx"] Dec 03 09:17:23 crc kubenswrapper[4831]: I1203 09:17:23.664993 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv869" event={"ID":"c95e37c3-4722-495f-8dc4-5c6585e24c24","Type":"ContainerStarted","Data":"6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3"} Dec 03 09:17:23 crc kubenswrapper[4831]: I1203 09:17:23.665207 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-n54lx" podUID="441da6e5-70c4-49b1-9c8c-5c652657d074" containerName="registry-server" containerID="cri-o://1000d54136dde4a85cabd04c6fc90bdb04b0de7e1f5159da23953aa496f5dd69" gracePeriod=2 Dec 03 09:17:23 crc kubenswrapper[4831]: I1203 09:17:23.706673 4831 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-fv869" podStartSLOduration=3.276082066 podStartE2EDuration="5.70662916s" podCreationTimestamp="2025-12-03 09:17:18 +0000 UTC" firstStartedPulling="2025-12-03 09:17:20.616610876 +0000 UTC m=+9977.960194394" lastFinishedPulling="2025-12-03 09:17:23.04715797 +0000 UTC m=+9980.390741488" observedRunningTime="2025-12-03 
09:17:23.696872676 +0000 UTC m=+9981.040456184" watchObservedRunningTime="2025-12-03 09:17:23.70662916 +0000 UTC m=+9981.050212678" Dec 03 09:17:24 crc kubenswrapper[4831]: I1203 09:17:24.681936 4831 generic.go:334] "Generic (PLEG): container finished" podID="441da6e5-70c4-49b1-9c8c-5c652657d074" containerID="1000d54136dde4a85cabd04c6fc90bdb04b0de7e1f5159da23953aa496f5dd69" exitCode=0 Dec 03 09:17:24 crc kubenswrapper[4831]: I1203 09:17:24.682043 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n54lx" event={"ID":"441da6e5-70c4-49b1-9c8c-5c652657d074","Type":"ContainerDied","Data":"1000d54136dde4a85cabd04c6fc90bdb04b0de7e1f5159da23953aa496f5dd69"} Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.156671 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.274866 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-utilities\") pod \"441da6e5-70c4-49b1-9c8c-5c652657d074\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.275207 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-catalog-content\") pod \"441da6e5-70c4-49b1-9c8c-5c652657d074\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.275276 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ktvgq\" (UniqueName: \"kubernetes.io/projected/441da6e5-70c4-49b1-9c8c-5c652657d074-kube-api-access-ktvgq\") pod \"441da6e5-70c4-49b1-9c8c-5c652657d074\" (UID: \"441da6e5-70c4-49b1-9c8c-5c652657d074\") " Dec 03 09:17:25 crc 
kubenswrapper[4831]: I1203 09:17:25.275964 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-utilities" (OuterVolumeSpecName: "utilities") pod "441da6e5-70c4-49b1-9c8c-5c652657d074" (UID: "441da6e5-70c4-49b1-9c8c-5c652657d074"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.276697 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.291119 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/441da6e5-70c4-49b1-9c8c-5c652657d074-kube-api-access-ktvgq" (OuterVolumeSpecName: "kube-api-access-ktvgq") pod "441da6e5-70c4-49b1-9c8c-5c652657d074" (UID: "441da6e5-70c4-49b1-9c8c-5c652657d074"). InnerVolumeSpecName "kube-api-access-ktvgq". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.322515 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "441da6e5-70c4-49b1-9c8c-5c652657d074" (UID: "441da6e5-70c4-49b1-9c8c-5c652657d074"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.380619 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/441da6e5-70c4-49b1-9c8c-5c652657d074-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.380645 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ktvgq\" (UniqueName: \"kubernetes.io/projected/441da6e5-70c4-49b1-9c8c-5c652657d074-kube-api-access-ktvgq\") on node \"crc\" DevicePath \"\"" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.707564 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-n54lx" event={"ID":"441da6e5-70c4-49b1-9c8c-5c652657d074","Type":"ContainerDied","Data":"047dd942f09acad566a4868c5b3f4b2474b36cecf9c6256fc082fcd3b81ac57f"} Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.707633 4831 scope.go:117] "RemoveContainer" containerID="1000d54136dde4a85cabd04c6fc90bdb04b0de7e1f5159da23953aa496f5dd69" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.707650 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-n54lx" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.747116 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-n54lx"] Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.749486 4831 scope.go:117] "RemoveContainer" containerID="6350551b17e90d8e15ead336d185010c4599a4373c63fc9f8e88efa35f231bae" Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.760181 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-n54lx"] Dec 03 09:17:25 crc kubenswrapper[4831]: I1203 09:17:25.801780 4831 scope.go:117] "RemoveContainer" containerID="a06dde41308244717eecf469415377b2bb0354214d3a748b38b21eba814037fd" Dec 03 09:17:27 crc kubenswrapper[4831]: I1203 09:17:27.028264 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="441da6e5-70c4-49b1-9c8c-5c652657d074" path="/var/lib/kubelet/pods/441da6e5-70c4-49b1-9c8c-5c652657d074/volumes" Dec 03 09:17:29 crc kubenswrapper[4831]: I1203 09:17:29.262583 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:29 crc kubenswrapper[4831]: I1203 09:17:29.263132 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:29 crc kubenswrapper[4831]: I1203 09:17:29.324648 4831 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:29 crc kubenswrapper[4831]: I1203 09:17:29.859441 4831 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:30 crc kubenswrapper[4831]: I1203 09:17:30.708725 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv869"] Dec 03 09:17:31 crc 
kubenswrapper[4831]: I1203 09:17:31.824455 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-fv869" podUID="c95e37c3-4722-495f-8dc4-5c6585e24c24" containerName="registry-server" containerID="cri-o://6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3" gracePeriod=2 Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.012523 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf" Dec 03 09:17:32 crc kubenswrapper[4831]: E1203 09:17:32.013201 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.403644 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.585221 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-catalog-content\") pod \"c95e37c3-4722-495f-8dc4-5c6585e24c24\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.585262 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-utilities\") pod \"c95e37c3-4722-495f-8dc4-5c6585e24c24\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.585416 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vgtk9\" (UniqueName: \"kubernetes.io/projected/c95e37c3-4722-495f-8dc4-5c6585e24c24-kube-api-access-vgtk9\") pod \"c95e37c3-4722-495f-8dc4-5c6585e24c24\" (UID: \"c95e37c3-4722-495f-8dc4-5c6585e24c24\") " Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.586125 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-utilities" (OuterVolumeSpecName: "utilities") pod "c95e37c3-4722-495f-8dc4-5c6585e24c24" (UID: "c95e37c3-4722-495f-8dc4-5c6585e24c24"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.586786 4831 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-utilities\") on node \"crc\" DevicePath \"\"" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.591146 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c95e37c3-4722-495f-8dc4-5c6585e24c24-kube-api-access-vgtk9" (OuterVolumeSpecName: "kube-api-access-vgtk9") pod "c95e37c3-4722-495f-8dc4-5c6585e24c24" (UID: "c95e37c3-4722-495f-8dc4-5c6585e24c24"). InnerVolumeSpecName "kube-api-access-vgtk9". PluginName "kubernetes.io/projected", VolumeGidValue "" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.604766 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c95e37c3-4722-495f-8dc4-5c6585e24c24" (UID: "c95e37c3-4722-495f-8dc4-5c6585e24c24"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.688589 4831 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c95e37c3-4722-495f-8dc4-5c6585e24c24-catalog-content\") on node \"crc\" DevicePath \"\"" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.688617 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vgtk9\" (UniqueName: \"kubernetes.io/projected/c95e37c3-4722-495f-8dc4-5c6585e24c24-kube-api-access-vgtk9\") on node \"crc\" DevicePath \"\"" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.838073 4831 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-fv869" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.838095 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv869" event={"ID":"c95e37c3-4722-495f-8dc4-5c6585e24c24","Type":"ContainerDied","Data":"6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3"} Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.838160 4831 scope.go:117] "RemoveContainer" containerID="6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.837950 4831 generic.go:334] "Generic (PLEG): container finished" podID="c95e37c3-4722-495f-8dc4-5c6585e24c24" containerID="6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3" exitCode=0 Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.847432 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-fv869" event={"ID":"c95e37c3-4722-495f-8dc4-5c6585e24c24","Type":"ContainerDied","Data":"b1601ebdc1cc71bc8fe0e7b81b00a5c62b2d4e621ab2f91d9cb5a025baa2fc9c"} Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.860273 4831 scope.go:117] "RemoveContainer" containerID="95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.887154 4831 scope.go:117] "RemoveContainer" containerID="942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0" Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.889733 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv869"] Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.909001 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-fv869"] Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.961816 4831 scope.go:117] "RemoveContainer" 
containerID="6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3"
Dec 03 09:17:32 crc kubenswrapper[4831]: E1203 09:17:32.962425 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3\": container with ID starting with 6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3 not found: ID does not exist" containerID="6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3"
Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.962454 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3"} err="failed to get container status \"6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3\": rpc error: code = NotFound desc = could not find container \"6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3\": container with ID starting with 6515b6922e4a0706c4dbd3b8a048189653b51f3dd73158e2c888e06f3e5487e3 not found: ID does not exist"
Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.962480 4831 scope.go:117] "RemoveContainer" containerID="95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3"
Dec 03 09:17:32 crc kubenswrapper[4831]: E1203 09:17:32.962847 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3\": container with ID starting with 95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3 not found: ID does not exist" containerID="95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3"
Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.962911 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3"} err="failed to get container status \"95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3\": rpc error: code = NotFound desc = could not find container \"95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3\": container with ID starting with 95e74861373c46310eb0229b411e86d133ba89ad79bc223249437d92903891e3 not found: ID does not exist"
Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.962944 4831 scope.go:117] "RemoveContainer" containerID="942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0"
Dec 03 09:17:32 crc kubenswrapper[4831]: E1203 09:17:32.963309 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0\": container with ID starting with 942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0 not found: ID does not exist" containerID="942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0"
Dec 03 09:17:32 crc kubenswrapper[4831]: I1203 09:17:32.963352 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0"} err="failed to get container status \"942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0\": rpc error: code = NotFound desc = could not find container \"942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0\": container with ID starting with 942ccc56eb83b905913f09c661759a1c1246d172726fd7a83e189599607f6db0 not found: ID does not exist"
Dec 03 09:17:33 crc kubenswrapper[4831]: I1203 09:17:33.028529 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c95e37c3-4722-495f-8dc4-5c6585e24c24" path="/var/lib/kubelet/pods/c95e37c3-4722-495f-8dc4-5c6585e24c24/volumes"
Dec 03 09:17:43 crc kubenswrapper[4831]: I1203 09:17:43.403617 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:17:43 crc kubenswrapper[4831]: E1203 09:17:43.406294 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:17:57 crc kubenswrapper[4831]: I1203 09:17:57.013214 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:17:57 crc kubenswrapper[4831]: E1203 09:17:57.014249 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:18:12 crc kubenswrapper[4831]: I1203 09:18:12.012742 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:18:12 crc kubenswrapper[4831]: E1203 09:18:12.013564 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:18:25 crc kubenswrapper[4831]: I1203 09:18:25.013965 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:18:25 crc kubenswrapper[4831]: E1203 09:18:25.014914 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:18:39 crc kubenswrapper[4831]: I1203 09:18:39.013627 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:18:39 crc kubenswrapper[4831]: E1203 09:18:39.014335 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:18:52 crc kubenswrapper[4831]: I1203 09:18:52.013105 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:18:52 crc kubenswrapper[4831]: E1203 09:18:52.013799 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:19:07 crc kubenswrapper[4831]: I1203 09:19:07.014052 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:19:07 crc kubenswrapper[4831]: E1203 09:19:07.015857 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:19:19 crc kubenswrapper[4831]: I1203 09:19:19.014523 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:19:19 crc kubenswrapper[4831]: E1203 09:19:19.015177 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:19:33 crc kubenswrapper[4831]: I1203 09:19:33.837160 4831 generic.go:334] "Generic (PLEG): container finished" podID="8e6975b6-9f6b-4100-a72e-fcf15a303bc0" containerID="4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407" exitCode=0
Dec 03 09:19:33 crc kubenswrapper[4831]: I1203 09:19:33.837336 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-cqxbg/must-gather-wpzz9" event={"ID":"8e6975b6-9f6b-4100-a72e-fcf15a303bc0","Type":"ContainerDied","Data":"4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407"}
Dec 03 09:19:33 crc kubenswrapper[4831]: I1203 09:19:33.838844 4831 scope.go:117] "RemoveContainer" containerID="4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407"
Dec 03 09:19:34 crc kubenswrapper[4831]: I1203 09:19:34.020081 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:19:34 crc kubenswrapper[4831]: E1203 09:19:34.020726 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:19:34 crc kubenswrapper[4831]: I1203 09:19:34.813875 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cqxbg_must-gather-wpzz9_8e6975b6-9f6b-4100-a72e-fcf15a303bc0/gather/0.log"
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.295531 4831 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-cqxbg/must-gather-wpzz9"]
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.296526 4831 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-cqxbg/must-gather-wpzz9" podUID="8e6975b6-9f6b-4100-a72e-fcf15a303bc0" containerName="copy" containerID="cri-o://01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e" gracePeriod=2
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.309481 4831 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-cqxbg/must-gather-wpzz9"]
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.801363 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cqxbg_must-gather-wpzz9_8e6975b6-9f6b-4100-a72e-fcf15a303bc0/copy/0.log"
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.802053 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxbg/must-gather-wpzz9"
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.970285 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-must-gather-output\") pod \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\" (UID: \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\") "
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.970365 4831 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mxv5\" (UniqueName: \"kubernetes.io/projected/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-kube-api-access-7mxv5\") pod \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\" (UID: \"8e6975b6-9f6b-4100-a72e-fcf15a303bc0\") "
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.976426 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-kube-api-access-7mxv5" (OuterVolumeSpecName: "kube-api-access-7mxv5") pod "8e6975b6-9f6b-4100-a72e-fcf15a303bc0" (UID: "8e6975b6-9f6b-4100-a72e-fcf15a303bc0"). InnerVolumeSpecName "kube-api-access-7mxv5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.981721 4831 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-cqxbg_must-gather-wpzz9_8e6975b6-9f6b-4100-a72e-fcf15a303bc0/copy/0.log"
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.983526 4831 generic.go:334] "Generic (PLEG): container finished" podID="8e6975b6-9f6b-4100-a72e-fcf15a303bc0" containerID="01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e" exitCode=143
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.983582 4831 scope.go:117] "RemoveContainer" containerID="01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e"
Dec 03 09:19:44 crc kubenswrapper[4831]: I1203 09:19:44.983603 4831 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-cqxbg/must-gather-wpzz9"
Dec 03 09:19:45 crc kubenswrapper[4831]: I1203 09:19:45.059029 4831 scope.go:117] "RemoveContainer" containerID="4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407"
Dec 03 09:19:45 crc kubenswrapper[4831]: I1203 09:19:45.073265 4831 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mxv5\" (UniqueName: \"kubernetes.io/projected/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-kube-api-access-7mxv5\") on node \"crc\" DevicePath \"\""
Dec 03 09:19:45 crc kubenswrapper[4831]: I1203 09:19:45.181248 4831 scope.go:117] "RemoveContainer" containerID="01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e"
Dec 03 09:19:45 crc kubenswrapper[4831]: E1203 09:19:45.181933 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e\": container with ID starting with 01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e not found: ID does not exist" containerID="01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e"
Dec 03 09:19:45 crc kubenswrapper[4831]: I1203 09:19:45.181989 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e"} err="failed to get container status \"01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e\": rpc error: code = NotFound desc = could not find container \"01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e\": container with ID starting with 01f720971ce668fcc9ee70cd554765829dfb54cc48c34571669ba7468792312e not found: ID does not exist"
Dec 03 09:19:45 crc kubenswrapper[4831]: I1203 09:19:45.182016 4831 scope.go:117] "RemoveContainer" containerID="4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407"
Dec 03 09:19:45 crc kubenswrapper[4831]: E1203 09:19:45.182241 4831 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407\": container with ID starting with 4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407 not found: ID does not exist" containerID="4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407"
Dec 03 09:19:45 crc kubenswrapper[4831]: I1203 09:19:45.182264 4831 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407"} err="failed to get container status \"4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407\": rpc error: code = NotFound desc = could not find container \"4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407\": container with ID starting with 4c1f0e2b9ba0cd61d64214ec595fff1f77b9e18ed811e0e06aa1da228e175407 not found: ID does not exist"
Dec 03 09:19:45 crc kubenswrapper[4831]: I1203 09:19:45.220575 4831 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "8e6975b6-9f6b-4100-a72e-fcf15a303bc0" (UID: "8e6975b6-9f6b-4100-a72e-fcf15a303bc0"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Dec 03 09:19:45 crc kubenswrapper[4831]: I1203 09:19:45.277411 4831 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/8e6975b6-9f6b-4100-a72e-fcf15a303bc0-must-gather-output\") on node \"crc\" DevicePath \"\""
Dec 03 09:19:47 crc kubenswrapper[4831]: I1203 09:19:47.027300 4831 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e6975b6-9f6b-4100-a72e-fcf15a303bc0" path="/var/lib/kubelet/pods/8e6975b6-9f6b-4100-a72e-fcf15a303bc0/volumes"
Dec 03 09:19:49 crc kubenswrapper[4831]: I1203 09:19:49.012831 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:19:49 crc kubenswrapper[4831]: E1203 09:19:49.013789 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:20:03 crc kubenswrapper[4831]: I1203 09:20:03.023270 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:20:03 crc kubenswrapper[4831]: E1203 09:20:03.024044 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:20:15 crc kubenswrapper[4831]: I1203 09:20:15.013644 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:20:15 crc kubenswrapper[4831]: E1203 09:20:15.015123 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:20:27 crc kubenswrapper[4831]: I1203 09:20:27.013595 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:20:27 crc kubenswrapper[4831]: E1203 09:20:27.014634 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:20:39 crc kubenswrapper[4831]: I1203 09:20:39.014559 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:20:39 crc kubenswrapper[4831]: E1203 09:20:39.016053 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:20:51 crc kubenswrapper[4831]: I1203 09:20:51.013151 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:20:51 crc kubenswrapper[4831]: E1203 09:20:51.015703 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:21:05 crc kubenswrapper[4831]: I1203 09:21:05.013771 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:21:05 crc kubenswrapper[4831]: E1203 09:21:05.014843 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:21:17 crc kubenswrapper[4831]: I1203 09:21:17.012906 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:21:17 crc kubenswrapper[4831]: E1203 09:21:17.013812 4831 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-dvcq5_openshift-machine-config-operator(4e04caf2-8e18-4af8-9779-c5711262077b)\"" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" podUID="4e04caf2-8e18-4af8-9779-c5711262077b"
Dec 03 09:21:30 crc kubenswrapper[4831]: I1203 09:21:30.013499 4831 scope.go:117] "RemoveContainer" containerID="12146e37b545f07bbb8d773a8c10014dbef218c0bc3a16597ae947fc9a877ebf"
Dec 03 09:21:30 crc kubenswrapper[4831]: I1203 09:21:30.421831 4831 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-dvcq5" event={"ID":"4e04caf2-8e18-4af8-9779-c5711262077b","Type":"ContainerStarted","Data":"a03e090a170e88589fc16b6f79720ef72c2981367c3bc19f95015f804c3bf9f6"}